Oct 10 09:04:58 localhost kernel: Linux version 5.14.0-621.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Tue Sep 30 07:37:35 UTC 2025
Oct 10 09:04:58 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct 10 09:04:58 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-621.el9.x86_64 root=UUID=9839e2e1-98a2-4594-b609-79d514deb0a3 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 10 09:04:58 localhost kernel: BIOS-provided physical RAM map:
Oct 10 09:04:58 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct 10 09:04:58 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct 10 09:04:58 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct 10 09:04:58 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct 10 09:04:58 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct 10 09:04:58 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct 10 09:04:58 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct 10 09:04:58 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct 10 09:04:58 localhost kernel: NX (Execute Disable) protection: active
Oct 10 09:04:58 localhost kernel: APIC: Static calls initialized
Oct 10 09:04:58 localhost kernel: SMBIOS 2.8 present.
Oct 10 09:04:58 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct 10 09:04:58 localhost kernel: Hypervisor detected: KVM
Oct 10 09:04:58 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct 10 09:04:58 localhost kernel: kvm-clock: using sched offset of 4561135004 cycles
Oct 10 09:04:58 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct 10 09:04:58 localhost kernel: tsc: Detected 2800.000 MHz processor
Oct 10 09:04:58 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Oct 10 09:04:58 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Oct 10 09:04:58 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct 10 09:04:58 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct 10 09:04:58 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct 10 09:04:58 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct 10 09:04:58 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct 10 09:04:58 localhost kernel: Using GB pages for direct mapping
Oct 10 09:04:58 localhost kernel: RAMDISK: [mem 0x2d858000-0x32c23fff]
Oct 10 09:04:58 localhost kernel: ACPI: Early table checksum verification disabled
Oct 10 09:04:58 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct 10 09:04:58 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 10 09:04:58 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 10 09:04:58 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 10 09:04:58 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct 10 09:04:58 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 10 09:04:58 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 10 09:04:58 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct 10 09:04:58 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct 10 09:04:58 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct 10 09:04:58 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct 10 09:04:58 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct 10 09:04:58 localhost kernel: No NUMA configuration found
Oct 10 09:04:58 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct 10 09:04:58 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Oct 10 09:04:58 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct 10 09:04:58 localhost kernel: Zone ranges:
Oct 10 09:04:58 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct 10 09:04:58 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct 10 09:04:58 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct 10 09:04:58 localhost kernel:   Device   empty
Oct 10 09:04:58 localhost kernel: Movable zone start for each node
Oct 10 09:04:58 localhost kernel: Early memory node ranges
Oct 10 09:04:58 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct 10 09:04:58 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct 10 09:04:58 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct 10 09:04:58 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct 10 09:04:58 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct 10 09:04:58 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct 10 09:04:58 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct 10 09:04:58 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Oct 10 09:04:58 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct 10 09:04:58 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct 10 09:04:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct 10 09:04:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct 10 09:04:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct 10 09:04:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct 10 09:04:58 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct 10 09:04:58 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct 10 09:04:58 localhost kernel: TSC deadline timer available
Oct 10 09:04:58 localhost kernel: CPU topo: Max. logical packages:   8
Oct 10 09:04:58 localhost kernel: CPU topo: Max. logical dies:       8
Oct 10 09:04:58 localhost kernel: CPU topo: Max. dies per package:   1
Oct 10 09:04:58 localhost kernel: CPU topo: Max. threads per core:   1
Oct 10 09:04:58 localhost kernel: CPU topo: Num. cores per package:     1
Oct 10 09:04:58 localhost kernel: CPU topo: Num. threads per package:   1
Oct 10 09:04:58 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct 10 09:04:58 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct 10 09:04:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct 10 09:04:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct 10 09:04:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct 10 09:04:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct 10 09:04:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct 10 09:04:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct 10 09:04:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct 10 09:04:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct 10 09:04:58 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct 10 09:04:58 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct 10 09:04:58 localhost kernel: Booting paravirtualized kernel on KVM
Oct 10 09:04:58 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct 10 09:04:58 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct 10 09:04:58 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct 10 09:04:58 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Oct 10 09:04:58 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Oct 10 09:04:58 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Oct 10 09:04:58 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-621.el9.x86_64 root=UUID=9839e2e1-98a2-4594-b609-79d514deb0a3 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 10 09:04:58 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-621.el9.x86_64", will be passed to user space.
Oct 10 09:04:58 localhost kernel: random: crng init done
Oct 10 09:04:58 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct 10 09:04:58 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct 10 09:04:58 localhost kernel: Fallback order for Node 0: 0 
Oct 10 09:04:58 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct 10 09:04:58 localhost kernel: Policy zone: Normal
Oct 10 09:04:58 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct 10 09:04:58 localhost kernel: software IO TLB: area num 8.
Oct 10 09:04:58 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct 10 09:04:58 localhost kernel: ftrace: allocating 49162 entries in 193 pages
Oct 10 09:04:58 localhost kernel: ftrace: allocated 193 pages with 3 groups
Oct 10 09:04:58 localhost kernel: Dynamic Preempt: voluntary
Oct 10 09:04:58 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Oct 10 09:04:58 localhost kernel: rcu:         RCU event tracing is enabled.
Oct 10 09:04:58 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct 10 09:04:58 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Oct 10 09:04:58 localhost kernel:         Rude variant of Tasks RCU enabled.
Oct 10 09:04:58 localhost kernel:         Tracing variant of Tasks RCU enabled.
Oct 10 09:04:58 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct 10 09:04:58 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct 10 09:04:58 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 10 09:04:58 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 10 09:04:58 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 10 09:04:58 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct 10 09:04:58 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct 10 09:04:58 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct 10 09:04:58 localhost kernel: Console: colour VGA+ 80x25
Oct 10 09:04:58 localhost kernel: printk: console [ttyS0] enabled
Oct 10 09:04:58 localhost kernel: ACPI: Core revision 20230331
Oct 10 09:04:58 localhost kernel: APIC: Switch to symmetric I/O mode setup
Oct 10 09:04:58 localhost kernel: x2apic enabled
Oct 10 09:04:58 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Oct 10 09:04:58 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct 10 09:04:58 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct 10 09:04:58 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct 10 09:04:58 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct 10 09:04:58 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct 10 09:04:58 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct 10 09:04:58 localhost kernel: Spectre V2 : Mitigation: Retpolines
Oct 10 09:04:58 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct 10 09:04:58 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct 10 09:04:58 localhost kernel: RETBleed: Mitigation: untrained return thunk
Oct 10 09:04:58 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct 10 09:04:58 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct 10 09:04:58 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct 10 09:04:58 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct 10 09:04:58 localhost kernel: x86/bugs: return thunk changed
Oct 10 09:04:58 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct 10 09:04:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct 10 09:04:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct 10 09:04:58 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct 10 09:04:58 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct 10 09:04:58 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct 10 09:04:58 localhost kernel: Freeing SMP alternatives memory: 40K
Oct 10 09:04:58 localhost kernel: pid_max: default: 32768 minimum: 301
Oct 10 09:04:58 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct 10 09:04:58 localhost kernel: landlock: Up and running.
Oct 10 09:04:58 localhost kernel: Yama: becoming mindful.
Oct 10 09:04:58 localhost kernel: SELinux:  Initializing.
Oct 10 09:04:58 localhost kernel: LSM support for eBPF active
Oct 10 09:04:58 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 10 09:04:58 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 10 09:04:58 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct 10 09:04:58 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct 10 09:04:58 localhost kernel: ... version:                0
Oct 10 09:04:58 localhost kernel: ... bit width:              48
Oct 10 09:04:58 localhost kernel: ... generic registers:      6
Oct 10 09:04:58 localhost kernel: ... value mask:             0000ffffffffffff
Oct 10 09:04:58 localhost kernel: ... max period:             00007fffffffffff
Oct 10 09:04:58 localhost kernel: ... fixed-purpose events:   0
Oct 10 09:04:58 localhost kernel: ... event mask:             000000000000003f
Oct 10 09:04:58 localhost kernel: signal: max sigframe size: 1776
Oct 10 09:04:58 localhost kernel: rcu: Hierarchical SRCU implementation.
Oct 10 09:04:58 localhost kernel: rcu:         Max phase no-delay instances is 400.
Oct 10 09:04:58 localhost kernel: smp: Bringing up secondary CPUs ...
Oct 10 09:04:58 localhost kernel: smpboot: x86: Booting SMP configuration:
Oct 10 09:04:58 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct 10 09:04:58 localhost kernel: smp: Brought up 1 node, 8 CPUs
Oct 10 09:04:58 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct 10 09:04:58 localhost kernel: node 0 deferred pages initialised in 15ms
Oct 10 09:04:58 localhost kernel: Memory: 7765864K/8388068K available (16384K kernel code, 5784K rwdata, 13864K rodata, 4188K init, 7196K bss, 616208K reserved, 0K cma-reserved)
Oct 10 09:04:58 localhost kernel: devtmpfs: initialized
Oct 10 09:04:58 localhost kernel: x86/mm: Memory block size: 128MB
Oct 10 09:04:58 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct 10 09:04:58 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct 10 09:04:58 localhost kernel: pinctrl core: initialized pinctrl subsystem
Oct 10 09:04:58 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct 10 09:04:58 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct 10 09:04:58 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct 10 09:04:58 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct 10 09:04:58 localhost kernel: audit: initializing netlink subsys (disabled)
Oct 10 09:04:58 localhost kernel: audit: type=2000 audit(1760087096.570:1): state=initialized audit_enabled=0 res=1
Oct 10 09:04:58 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct 10 09:04:58 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct 10 09:04:58 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Oct 10 09:04:58 localhost kernel: cpuidle: using governor menu
Oct 10 09:04:58 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct 10 09:04:58 localhost kernel: PCI: Using configuration type 1 for base access
Oct 10 09:04:58 localhost kernel: PCI: Using configuration type 1 for extended access
Oct 10 09:04:58 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct 10 09:04:58 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct 10 09:04:58 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct 10 09:04:58 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct 10 09:04:58 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct 10 09:04:58 localhost kernel: Demotion targets for Node 0: null
Oct 10 09:04:58 localhost kernel: cryptd: max_cpu_qlen set to 1000
Oct 10 09:04:58 localhost kernel: ACPI: Added _OSI(Module Device)
Oct 10 09:04:58 localhost kernel: ACPI: Added _OSI(Processor Device)
Oct 10 09:04:58 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct 10 09:04:58 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct 10 09:04:58 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct 10 09:04:58 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct 10 09:04:58 localhost kernel: ACPI: Interpreter enabled
Oct 10 09:04:58 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct 10 09:04:58 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Oct 10 09:04:58 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct 10 09:04:58 localhost kernel: PCI: Using E820 reservations for host bridge windows
Oct 10 09:04:58 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct 10 09:04:58 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct 10 09:04:58 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [3] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [4] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [5] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [6] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [7] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [8] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [9] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [10] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [11] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [12] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [13] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [14] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [15] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [16] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [17] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [18] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [19] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [20] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [21] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [22] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [23] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [24] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [25] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [26] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [27] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [28] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [29] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [30] registered
Oct 10 09:04:58 localhost kernel: acpiphp: Slot [31] registered
Oct 10 09:04:58 localhost kernel: PCI host bridge to bus 0000:00
Oct 10 09:04:58 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct 10 09:04:58 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct 10 09:04:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct 10 09:04:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct 10 09:04:58 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct 10 09:04:58 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct 10 09:04:58 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct 10 09:04:58 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct 10 09:04:58 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct 10 09:04:58 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct 10 09:04:58 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct 10 09:04:58 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct 10 09:04:58 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct 10 09:04:58 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 10 09:04:58 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct 10 09:04:58 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct 10 09:04:58 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct 10 09:04:58 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct 10 09:04:58 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct 10 09:04:58 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct 10 09:04:58 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct 10 09:04:58 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct 10 09:04:58 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct 10 09:04:58 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct 10 09:04:58 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct 10 09:04:58 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct 10 09:04:58 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct 10 09:04:58 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct 10 09:04:58 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct 10 09:04:58 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct 10 09:04:58 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct 10 09:04:58 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct 10 09:04:58 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct 10 09:04:58 localhost kernel: iommu: Default domain type: Translated
Oct 10 09:04:58 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct 10 09:04:58 localhost kernel: SCSI subsystem initialized
Oct 10 09:04:58 localhost kernel: ACPI: bus type USB registered
Oct 10 09:04:58 localhost kernel: usbcore: registered new interface driver usbfs
Oct 10 09:04:58 localhost kernel: usbcore: registered new interface driver hub
Oct 10 09:04:58 localhost kernel: usbcore: registered new device driver usb
Oct 10 09:04:58 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Oct 10 09:04:58 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct 10 09:04:58 localhost kernel: PTP clock support registered
Oct 10 09:04:58 localhost kernel: EDAC MC: Ver: 3.0.0
Oct 10 09:04:58 localhost kernel: NetLabel: Initializing
Oct 10 09:04:58 localhost kernel: NetLabel:  domain hash size = 128
Oct 10 09:04:58 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct 10 09:04:58 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Oct 10 09:04:58 localhost kernel: PCI: Using ACPI for IRQ routing
Oct 10 09:04:58 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Oct 10 09:04:58 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Oct 10 09:04:58 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Oct 10 09:04:58 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct 10 09:04:58 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct 10 09:04:58 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct 10 09:04:58 localhost kernel: vgaarb: loaded
Oct 10 09:04:58 localhost kernel: clocksource: Switched to clocksource kvm-clock
Oct 10 09:04:58 localhost kernel: VFS: Disk quotas dquot_6.6.0
Oct 10 09:04:58 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct 10 09:04:58 localhost kernel: pnp: PnP ACPI init
Oct 10 09:04:58 localhost kernel: pnp 00:03: [dma 2]
Oct 10 09:04:58 localhost kernel: pnp: PnP ACPI: found 5 devices
Oct 10 09:04:58 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct 10 09:04:58 localhost kernel: NET: Registered PF_INET protocol family
Oct 10 09:04:58 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct 10 09:04:58 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct 10 09:04:58 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct 10 09:04:58 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct 10 09:04:58 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct 10 09:04:58 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct 10 09:04:58 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct 10 09:04:58 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 10 09:04:58 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 10 09:04:58 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct 10 09:04:58 localhost kernel: NET: Registered PF_XDP protocol family
Oct 10 09:04:58 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct 10 09:04:58 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct 10 09:04:58 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct 10 09:04:58 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct 10 09:04:58 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct 10 09:04:58 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct 10 09:04:58 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct 10 09:04:58 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 82018 usecs
Oct 10 09:04:58 localhost kernel: PCI: CLS 0 bytes, default 64
Oct 10 09:04:58 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct 10 09:04:58 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct 10 09:04:58 localhost kernel: Trying to unpack rootfs image as initramfs...
Oct 10 09:04:58 localhost kernel: ACPI: bus type thunderbolt registered
Oct 10 09:04:58 localhost kernel: Initialise system trusted keyrings
Oct 10 09:04:58 localhost kernel: Key type blacklist registered
Oct 10 09:04:58 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct 10 09:04:58 localhost kernel: zbud: loaded
Oct 10 09:04:58 localhost kernel: integrity: Platform Keyring initialized
Oct 10 09:04:58 localhost kernel: integrity: Machine keyring initialized
Oct 10 09:04:58 localhost kernel: Freeing initrd memory: 85808K
Oct 10 09:04:58 localhost kernel: NET: Registered PF_ALG protocol family
Oct 10 09:04:58 localhost kernel: xor: automatically using best checksumming function   avx       
Oct 10 09:04:58 localhost kernel: Key type asymmetric registered
Oct 10 09:04:58 localhost kernel: Asymmetric key parser 'x509' registered
Oct 10 09:04:58 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct 10 09:04:58 localhost kernel: io scheduler mq-deadline registered
Oct 10 09:04:58 localhost kernel: io scheduler kyber registered
Oct 10 09:04:58 localhost kernel: io scheduler bfq registered
Oct 10 09:04:58 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct 10 09:04:58 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct 10 09:04:58 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct 10 09:04:58 localhost kernel: ACPI: button: Power Button [PWRF]
Oct 10 09:04:58 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct 10 09:04:58 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct 10 09:04:58 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct 10 09:04:58 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct 10 09:04:58 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct 10 09:04:58 localhost kernel: Non-volatile memory driver v1.3
Oct 10 09:04:58 localhost kernel: rdac: device handler registered
Oct 10 09:04:58 localhost kernel: hp_sw: device handler registered
Oct 10 09:04:58 localhost kernel: emc: device handler registered
Oct 10 09:04:58 localhost kernel: alua: device handler registered
Oct 10 09:04:58 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct 10 09:04:58 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct 10 09:04:58 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct 10 09:04:58 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct 10 09:04:58 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct 10 09:04:58 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct 10 09:04:58 localhost kernel: usb usb1: Product: UHCI Host Controller
Oct 10 09:04:58 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-621.el9.x86_64 uhci_hcd
Oct 10 09:04:58 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct 10 09:04:58 localhost kernel: hub 1-0:1.0: USB hub found
Oct 10 09:04:58 localhost kernel: hub 1-0:1.0: 2 ports detected
Oct 10 09:04:58 localhost kernel: usbcore: registered new interface driver usbserial_generic
Oct 10 09:04:58 localhost kernel: usbserial: USB Serial support registered for generic
Oct 10 09:04:58 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct 10 09:04:58 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct 10 09:04:58 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct 10 09:04:58 localhost kernel: mousedev: PS/2 mouse device common for all mice
Oct 10 09:04:58 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Oct 10 09:04:58 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct 10 09:04:58 localhost kernel: rtc_cmos 00:04: registered as rtc0
Oct 10 09:04:58 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-10-10T09:04:57 UTC (1760087097)
Oct 10 09:04:58 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct 10 09:04:58 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct 10 09:04:58 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct 10 09:04:58 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct 10 09:04:58 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Oct 10 09:04:58 localhost kernel: usbcore: registered new interface driver usbhid
Oct 10 09:04:58 localhost kernel: usbhid: USB HID core driver
Oct 10 09:04:58 localhost kernel: drop_monitor: Initializing network drop monitor service
Oct 10 09:04:58 localhost kernel: Initializing XFRM netlink socket
Oct 10 09:04:58 localhost kernel: NET: Registered PF_INET6 protocol family
Oct 10 09:04:58 localhost kernel: Segment Routing with IPv6
Oct 10 09:04:58 localhost kernel: NET: Registered PF_PACKET protocol family
Oct 10 09:04:58 localhost kernel: mpls_gso: MPLS GSO support
Oct 10 09:04:58 localhost kernel: IPI shorthand broadcast: enabled
Oct 10 09:04:58 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Oct 10 09:04:58 localhost kernel: AES CTR mode by8 optimization enabled
Oct 10 09:04:58 localhost kernel: sched_clock: Marking stable (1148024068, 149235727)->(1414112165, -116852370)
Oct 10 09:04:58 localhost kernel: registered taskstats version 1
Oct 10 09:04:58 localhost kernel: Loading compiled-in X.509 certificates
Oct 10 09:04:58 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 72f99a463516b0dfb027e50caab189f607ef1bc9'
Oct 10 09:04:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct 10 09:04:58 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct 10 09:04:58 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct 10 09:04:58 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct 10 09:04:58 localhost kernel: Demotion targets for Node 0: null
Oct 10 09:04:58 localhost kernel: page_owner is disabled
Oct 10 09:04:58 localhost kernel: Key type .fscrypt registered
Oct 10 09:04:58 localhost kernel: Key type fscrypt-provisioning registered
Oct 10 09:04:58 localhost kernel: Key type big_key registered
Oct 10 09:04:58 localhost kernel: Key type encrypted registered
Oct 10 09:04:58 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Oct 10 09:04:58 localhost kernel: Loading compiled-in module X.509 certificates
Oct 10 09:04:58 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 72f99a463516b0dfb027e50caab189f607ef1bc9'
Oct 10 09:04:58 localhost kernel: ima: Allocated hash algorithm: sha256
Oct 10 09:04:58 localhost kernel: ima: No architecture policies found
Oct 10 09:04:58 localhost kernel: evm: Initialising EVM extended attributes:
Oct 10 09:04:58 localhost kernel: evm: security.selinux
Oct 10 09:04:58 localhost kernel: evm: security.SMACK64 (disabled)
Oct 10 09:04:58 localhost kernel: evm: security.SMACK64EXEC (disabled)
Oct 10 09:04:58 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct 10 09:04:58 localhost kernel: evm: security.SMACK64MMAP (disabled)
Oct 10 09:04:58 localhost kernel: evm: security.apparmor (disabled)
Oct 10 09:04:58 localhost kernel: evm: security.ima
Oct 10 09:04:58 localhost kernel: evm: security.capability
Oct 10 09:04:58 localhost kernel: evm: HMAC attrs: 0x1
Oct 10 09:04:58 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct 10 09:04:58 localhost kernel: Running certificate verification RSA selftest
Oct 10 09:04:58 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct 10 09:04:58 localhost kernel: Running certificate verification ECDSA selftest
Oct 10 09:04:58 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct 10 09:04:58 localhost kernel: clk: Disabling unused clocks
Oct 10 09:04:58 localhost kernel: Freeing unused decrypted memory: 2028K
Oct 10 09:04:58 localhost kernel: Freeing unused kernel image (initmem) memory: 4188K
Oct 10 09:04:58 localhost kernel: Write protecting the kernel read-only data: 30720k
Oct 10 09:04:58 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 472K
Oct 10 09:04:58 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct 10 09:04:58 localhost kernel: Run /init as init process
Oct 10 09:04:58 localhost kernel:   with arguments:
Oct 10 09:04:58 localhost kernel:     /init
Oct 10 09:04:58 localhost kernel:   with environment:
Oct 10 09:04:58 localhost kernel:     HOME=/
Oct 10 09:04:58 localhost kernel:     TERM=linux
Oct 10 09:04:58 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-621.el9.x86_64
Oct 10 09:04:58 localhost systemd[1]: systemd 252-57.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 10 09:04:58 localhost systemd[1]: Detected virtualization kvm.
Oct 10 09:04:58 localhost systemd[1]: Detected architecture x86-64.
Oct 10 09:04:58 localhost systemd[1]: Running in initrd.
Oct 10 09:04:58 localhost systemd[1]: No hostname configured, using default hostname.
Oct 10 09:04:58 localhost systemd[1]: Hostname set to <localhost>.
Oct 10 09:04:58 localhost systemd[1]: Initializing machine ID from VM UUID.
Oct 10 09:04:58 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct 10 09:04:58 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct 10 09:04:58 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Oct 10 09:04:58 localhost kernel: usb 1-1: Manufacturer: QEMU
Oct 10 09:04:58 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct 10 09:04:58 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct 10 09:04:58 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct 10 09:04:58 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Oct 10 09:04:58 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 10 09:04:58 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 10 09:04:58 localhost systemd[1]: Reached target Initrd /usr File System.
Oct 10 09:04:58 localhost systemd[1]: Reached target Local File Systems.
Oct 10 09:04:58 localhost systemd[1]: Reached target Path Units.
Oct 10 09:04:58 localhost systemd[1]: Reached target Slice Units.
Oct 10 09:04:58 localhost systemd[1]: Reached target Swaps.
Oct 10 09:04:58 localhost systemd[1]: Reached target Timer Units.
Oct 10 09:04:58 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 10 09:04:58 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Oct 10 09:04:58 localhost systemd[1]: Listening on Journal Socket.
Oct 10 09:04:58 localhost systemd[1]: Listening on udev Control Socket.
Oct 10 09:04:58 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 10 09:04:58 localhost systemd[1]: Reached target Socket Units.
Oct 10 09:04:58 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 10 09:04:58 localhost systemd[1]: Starting Journal Service...
Oct 10 09:04:58 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 10 09:04:58 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 10 09:04:58 localhost systemd[1]: Starting Create System Users...
Oct 10 09:04:58 localhost systemd[1]: Starting Setup Virtual Console...
Oct 10 09:04:58 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 10 09:04:58 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 10 09:04:58 localhost systemd[1]: Finished Create System Users.
Oct 10 09:04:58 localhost systemd-journald[305]: Journal started
Oct 10 09:04:58 localhost systemd-journald[305]: Runtime Journal (/run/log/journal/55d065af02524401ad6e822a36bead06) is 8.0M, max 153.6M, 145.6M free.
Oct 10 09:04:58 localhost systemd-sysusers[309]: Creating group 'users' with GID 100.
Oct 10 09:04:58 localhost systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Oct 10 09:04:58 localhost systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct 10 09:04:58 localhost systemd[1]: Started Journal Service.
Oct 10 09:04:58 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 10 09:04:58 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 10 09:04:58 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 10 09:04:58 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 10 09:04:58 localhost systemd[1]: Finished Setup Virtual Console.
Oct 10 09:04:58 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct 10 09:04:58 localhost systemd[1]: Starting dracut cmdline hook...
Oct 10 09:04:58 localhost dracut-cmdline[322]: dracut-9 dracut-057-102.git20250818.el9
Oct 10 09:04:58 localhost dracut-cmdline[322]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-621.el9.x86_64 root=UUID=9839e2e1-98a2-4594-b609-79d514deb0a3 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 10 09:04:58 localhost systemd[1]: Finished dracut cmdline hook.
Oct 10 09:04:58 localhost systemd[1]: Starting dracut pre-udev hook...
Oct 10 09:04:58 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct 10 09:04:58 localhost kernel: device-mapper: uevent: version 1.0.3
Oct 10 09:04:58 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct 10 09:04:58 localhost kernel: RPC: Registered named UNIX socket transport module.
Oct 10 09:04:58 localhost kernel: RPC: Registered udp transport module.
Oct 10 09:04:58 localhost kernel: RPC: Registered tcp transport module.
Oct 10 09:04:58 localhost kernel: RPC: Registered tcp-with-tls transport module.
Oct 10 09:04:58 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct 10 09:04:58 localhost rpc.statd[439]: Version 2.5.4 starting
Oct 10 09:04:58 localhost rpc.statd[439]: Initializing NSM state
Oct 10 09:04:58 localhost rpc.idmapd[444]: Setting log level to 0
Oct 10 09:04:58 localhost systemd[1]: Finished dracut pre-udev hook.
Oct 10 09:04:58 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 10 09:04:58 localhost systemd-udevd[457]: Using default interface naming scheme 'rhel-9.0'.
Oct 10 09:04:58 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 10 09:04:58 localhost systemd[1]: Starting dracut pre-trigger hook...
Oct 10 09:04:58 localhost systemd[1]: Finished dracut pre-trigger hook.
Oct 10 09:04:58 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 10 09:04:58 localhost systemd[1]: Created slice Slice /system/modprobe.
Oct 10 09:04:58 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 10 09:04:58 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 10 09:04:58 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 10 09:04:58 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 10 09:04:58 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 10 09:04:58 localhost systemd[1]: Reached target Network.
Oct 10 09:04:58 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 10 09:04:58 localhost systemd[1]: Starting dracut initqueue hook...
Oct 10 09:04:58 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct 10 09:04:58 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct 10 09:04:58 localhost kernel:  vda: vda1
Oct 10 09:04:58 localhost kernel: libata version 3.00 loaded.
Oct 10 09:04:58 localhost systemd-udevd[473]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 09:04:58 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Oct 10 09:04:58 localhost kernel: scsi host0: ata_piix
Oct 10 09:04:58 localhost kernel: scsi host1: ata_piix
Oct 10 09:04:58 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct 10 09:04:58 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct 10 09:04:58 localhost systemd[1]: Found device /dev/disk/by-uuid/9839e2e1-98a2-4594-b609-79d514deb0a3.
Oct 10 09:04:59 localhost systemd[1]: Reached target Initrd Root Device.
Oct 10 09:04:59 localhost systemd[1]: Mounting Kernel Configuration File System...
Oct 10 09:04:59 localhost systemd[1]: Mounted Kernel Configuration File System.
Oct 10 09:04:59 localhost systemd[1]: Reached target System Initialization.
Oct 10 09:04:59 localhost kernel: ata1: found unknown device (class 0)
Oct 10 09:04:59 localhost systemd[1]: Reached target Basic System.
Oct 10 09:04:59 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct 10 09:04:59 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct 10 09:04:59 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct 10 09:04:59 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct 10 09:04:59 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct 10 09:04:59 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Oct 10 09:04:59 localhost systemd[1]: Finished dracut initqueue hook.
Oct 10 09:04:59 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Oct 10 09:04:59 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Oct 10 09:04:59 localhost systemd[1]: Reached target Remote File Systems.
Oct 10 09:04:59 localhost systemd[1]: Starting dracut pre-mount hook...
Oct 10 09:04:59 localhost systemd[1]: Finished dracut pre-mount hook.
Oct 10 09:04:59 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/9839e2e1-98a2-4594-b609-79d514deb0a3...
Oct 10 09:04:59 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Oct 10 09:04:59 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/9839e2e1-98a2-4594-b609-79d514deb0a3.
Oct 10 09:04:59 localhost systemd[1]: Mounting /sysroot...
Oct 10 09:04:59 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct 10 09:04:59 localhost kernel: XFS (vda1): Mounting V5 Filesystem 9839e2e1-98a2-4594-b609-79d514deb0a3
Oct 10 09:04:59 localhost kernel: XFS (vda1): Ending clean mount
Oct 10 09:04:59 localhost systemd[1]: Mounted /sysroot.
Oct 10 09:04:59 localhost systemd[1]: Reached target Initrd Root File System.
Oct 10 09:04:59 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct 10 09:04:59 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct 10 09:04:59 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct 10 09:04:59 localhost systemd[1]: Reached target Initrd File Systems.
Oct 10 09:04:59 localhost systemd[1]: Reached target Initrd Default Target.
Oct 10 09:04:59 localhost systemd[1]: Starting dracut mount hook...
Oct 10 09:04:59 localhost systemd[1]: Finished dracut mount hook.
Oct 10 09:04:59 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct 10 09:04:59 localhost rpc.idmapd[444]: exiting on signal 15
Oct 10 09:05:00 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct 10 09:05:00 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct 10 09:05:00 localhost systemd[1]: Stopped target Network.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Timer Units.
Oct 10 09:05:00 localhost systemd[1]: dbus.socket: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Oct 10 09:05:00 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Initrd Default Target.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Basic System.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Initrd Root Device.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Initrd /usr File System.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Path Units.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Remote File Systems.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Slice Units.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Socket Units.
Oct 10 09:05:00 localhost systemd[1]: Stopped target System Initialization.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Local File Systems.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Swaps.
Oct 10 09:05:00 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped dracut mount hook.
Oct 10 09:05:00 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped dracut pre-mount hook.
Oct 10 09:05:00 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Oct 10 09:05:00 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct 10 09:05:00 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped dracut initqueue hook.
Oct 10 09:05:00 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped Apply Kernel Variables.
Oct 10 09:05:00 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Oct 10 09:05:00 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped Coldplug All udev Devices.
Oct 10 09:05:00 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped dracut pre-trigger hook.
Oct 10 09:05:00 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct 10 09:05:00 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped Setup Virtual Console.
Oct 10 09:05:00 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct 10 09:05:00 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct 10 09:05:00 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Closed udev Control Socket.
Oct 10 09:05:00 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Closed udev Kernel Socket.
Oct 10 09:05:00 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped dracut pre-udev hook.
Oct 10 09:05:00 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped dracut cmdline hook.
Oct 10 09:05:00 localhost systemd[1]: Starting Cleanup udev Database...
Oct 10 09:05:00 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct 10 09:05:00 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Oct 10 09:05:00 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Stopped Create System Users.
Oct 10 09:05:00 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct 10 09:05:00 localhost systemd[1]: Finished Cleanup udev Database.
Oct 10 09:05:00 localhost systemd[1]: Reached target Switch Root.
Oct 10 09:05:00 localhost systemd[1]: Starting Switch Root...
Oct 10 09:05:00 localhost systemd[1]: Switching root.
Oct 10 09:05:00 localhost systemd-journald[305]: Journal stopped
Oct 10 09:05:01 localhost systemd-journald[305]: Received SIGTERM from PID 1 (systemd).
Oct 10 09:05:01 localhost kernel: audit: type=1404 audit(1760087100.288:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct 10 09:05:01 localhost kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 09:05:01 localhost kernel: SELinux:  policy capability open_perms=1
Oct 10 09:05:01 localhost kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 09:05:01 localhost kernel: SELinux:  policy capability always_check_network=0
Oct 10 09:05:01 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 09:05:01 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 09:05:01 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 09:05:01 localhost kernel: audit: type=1403 audit(1760087100.425:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct 10 09:05:01 localhost systemd[1]: Successfully loaded SELinux policy in 142.114ms.
Oct 10 09:05:01 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 30.379ms.
Oct 10 09:05:01 localhost systemd[1]: systemd 252-57.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 10 09:05:01 localhost systemd[1]: Detected virtualization kvm.
Oct 10 09:05:01 localhost systemd[1]: Detected architecture x86-64.
Oct 10 09:05:01 localhost systemd-rc-local-generator[637]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:05:01 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Oct 10 09:05:01 localhost systemd[1]: Stopped Switch Root.
Oct 10 09:05:01 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct 10 09:05:01 localhost systemd[1]: Created slice Slice /system/getty.
Oct 10 09:05:01 localhost systemd[1]: Created slice Slice /system/serial-getty.
Oct 10 09:05:01 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Oct 10 09:05:01 localhost systemd[1]: Created slice User and Session Slice.
Oct 10 09:05:01 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 10 09:05:01 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Oct 10 09:05:01 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct 10 09:05:01 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 10 09:05:01 localhost systemd[1]: Stopped target Switch Root.
Oct 10 09:05:01 localhost systemd[1]: Stopped target Initrd File Systems.
Oct 10 09:05:01 localhost systemd[1]: Stopped target Initrd Root File System.
Oct 10 09:05:01 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Oct 10 09:05:01 localhost systemd[1]: Reached target Path Units.
Oct 10 09:05:01 localhost systemd[1]: Reached target rpc_pipefs.target.
Oct 10 09:05:01 localhost systemd[1]: Reached target Slice Units.
Oct 10 09:05:01 localhost systemd[1]: Reached target Swaps.
Oct 10 09:05:01 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Oct 10 09:05:01 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Oct 10 09:05:01 localhost systemd[1]: Reached target RPC Port Mapper.
Oct 10 09:05:01 localhost systemd[1]: Listening on Process Core Dump Socket.
Oct 10 09:05:01 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Oct 10 09:05:01 localhost systemd[1]: Listening on udev Control Socket.
Oct 10 09:05:01 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 10 09:05:01 localhost systemd[1]: Mounting Huge Pages File System...
Oct 10 09:05:01 localhost systemd[1]: Mounting POSIX Message Queue File System...
Oct 10 09:05:01 localhost systemd[1]: Mounting Kernel Debug File System...
Oct 10 09:05:01 localhost systemd[1]: Mounting Kernel Trace File System...
Oct 10 09:05:01 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 10 09:05:01 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 10 09:05:01 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 10 09:05:01 localhost systemd[1]: Starting Load Kernel Module drm...
Oct 10 09:05:01 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Oct 10 09:05:01 localhost systemd[1]: Starting Load Kernel Module fuse...
Oct 10 09:05:01 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct 10 09:05:01 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Oct 10 09:05:01 localhost systemd[1]: Stopped File System Check on Root Device.
Oct 10 09:05:01 localhost systemd[1]: Stopped Journal Service.
Oct 10 09:05:01 localhost systemd[1]: Starting Journal Service...
Oct 10 09:05:01 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 10 09:05:01 localhost systemd[1]: Starting Generate network units from Kernel command line...
Oct 10 09:05:01 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 10 09:05:01 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Oct 10 09:05:01 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct 10 09:05:01 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 10 09:05:01 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 10 09:05:01 localhost kernel: fuse: init (API version 7.37)
Oct 10 09:05:01 localhost systemd-journald[678]: Journal started
Oct 10 09:05:01 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/a1727ec20198bc6caf436a6e13c4ff5e) is 8.0M, max 153.6M, 145.6M free.
Oct 10 09:05:00 localhost systemd[1]: Queued start job for default target Multi-User System.
Oct 10 09:05:00 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 10 09:05:01 localhost systemd[1]: Mounted Huge Pages File System.
Oct 10 09:05:01 localhost systemd[1]: Started Journal Service.
Oct 10 09:05:01 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct 10 09:05:01 localhost systemd[1]: Mounted POSIX Message Queue File System.
Oct 10 09:05:01 localhost systemd[1]: Mounted Kernel Debug File System.
Oct 10 09:05:01 localhost systemd[1]: Mounted Kernel Trace File System.
Oct 10 09:05:01 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 10 09:05:01 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 10 09:05:01 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 10 09:05:01 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct 10 09:05:01 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Oct 10 09:05:01 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct 10 09:05:01 localhost systemd[1]: Finished Load Kernel Module fuse.
Oct 10 09:05:01 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct 10 09:05:01 localhost systemd[1]: Finished Generate network units from Kernel command line.
Oct 10 09:05:01 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Oct 10 09:05:01 localhost systemd[1]: Mounting FUSE Control File System...
Oct 10 09:05:01 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 10 09:05:01 localhost kernel: ACPI: bus type drm_connector registered
Oct 10 09:05:01 localhost systemd[1]: Starting Rebuild Hardware Database...
Oct 10 09:05:01 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Oct 10 09:05:01 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct 10 09:05:01 localhost systemd[1]: Starting Load/Save OS Random Seed...
Oct 10 09:05:01 localhost systemd[1]: Starting Create System Users...
Oct 10 09:05:01 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct 10 09:05:01 localhost systemd[1]: Finished Load Kernel Module drm.
Oct 10 09:05:01 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 10 09:05:01 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/a1727ec20198bc6caf436a6e13c4ff5e) is 8.0M, max 153.6M, 145.6M free.
Oct 10 09:05:01 localhost systemd-journald[678]: Received client request to flush runtime journal.
Oct 10 09:05:01 localhost systemd[1]: Mounted FUSE Control File System.
Oct 10 09:05:01 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Oct 10 09:05:01 localhost systemd[1]: Finished Load/Save OS Random Seed.
Oct 10 09:05:01 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 10 09:05:01 localhost systemd[1]: Finished Create System Users.
Oct 10 09:05:01 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 10 09:05:01 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 10 09:05:01 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 10 09:05:01 localhost systemd[1]: Reached target Preparation for Local File Systems.
Oct 10 09:05:01 localhost systemd[1]: Reached target Local File Systems.
Oct 10 09:05:01 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct 10 09:05:01 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct 10 09:05:01 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct 10 09:05:01 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct 10 09:05:01 localhost systemd[1]: Starting Automatic Boot Loader Update...
Oct 10 09:05:01 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct 10 09:05:01 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 10 09:05:01 localhost bootctl[696]: Couldn't find EFI system partition, skipping.
Oct 10 09:05:01 localhost systemd[1]: Finished Automatic Boot Loader Update.
Oct 10 09:05:01 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 10 09:05:01 localhost systemd[1]: Starting Security Auditing Service...
Oct 10 09:05:01 localhost systemd[1]: Starting RPC Bind...
Oct 10 09:05:01 localhost systemd[1]: Starting Rebuild Journal Catalog...
Oct 10 09:05:01 localhost auditd[702]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct 10 09:05:01 localhost auditd[702]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct 10 09:05:01 localhost systemd[1]: Finished Rebuild Journal Catalog.
Oct 10 09:05:01 localhost systemd[1]: Started RPC Bind.
Oct 10 09:05:01 localhost augenrules[707]: /sbin/augenrules: No change
Oct 10 09:05:01 localhost augenrules[722]: No rules
Oct 10 09:05:01 localhost augenrules[722]: enabled 1
Oct 10 09:05:01 localhost augenrules[722]: failure 1
Oct 10 09:05:01 localhost augenrules[722]: pid 702
Oct 10 09:05:01 localhost augenrules[722]: rate_limit 0
Oct 10 09:05:01 localhost augenrules[722]: backlog_limit 8192
Oct 10 09:05:01 localhost augenrules[722]: lost 0
Oct 10 09:05:01 localhost augenrules[722]: backlog 1
Oct 10 09:05:01 localhost augenrules[722]: backlog_wait_time 60000
Oct 10 09:05:01 localhost augenrules[722]: backlog_wait_time_actual 0
Oct 10 09:05:01 localhost augenrules[722]: enabled 1
Oct 10 09:05:01 localhost augenrules[722]: failure 1
Oct 10 09:05:01 localhost augenrules[722]: pid 702
Oct 10 09:05:01 localhost augenrules[722]: rate_limit 0
Oct 10 09:05:01 localhost augenrules[722]: backlog_limit 8192
Oct 10 09:05:01 localhost augenrules[722]: lost 0
Oct 10 09:05:01 localhost augenrules[722]: backlog 2
Oct 10 09:05:01 localhost augenrules[722]: backlog_wait_time 60000
Oct 10 09:05:01 localhost augenrules[722]: backlog_wait_time_actual 0
Oct 10 09:05:01 localhost augenrules[722]: enabled 1
Oct 10 09:05:01 localhost augenrules[722]: failure 1
Oct 10 09:05:01 localhost augenrules[722]: pid 702
Oct 10 09:05:01 localhost augenrules[722]: rate_limit 0
Oct 10 09:05:01 localhost augenrules[722]: backlog_limit 8192
Oct 10 09:05:01 localhost augenrules[722]: lost 0
Oct 10 09:05:01 localhost augenrules[722]: backlog 4
Oct 10 09:05:01 localhost augenrules[722]: backlog_wait_time 60000
Oct 10 09:05:01 localhost augenrules[722]: backlog_wait_time_actual 0
Oct 10 09:05:01 localhost systemd[1]: Started Security Auditing Service.
Oct 10 09:05:01 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct 10 09:05:01 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct 10 09:05:01 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct 10 09:05:01 localhost systemd[1]: Finished Rebuild Hardware Database.
Oct 10 09:05:01 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 10 09:05:01 localhost systemd[1]: Starting Update is Completed...
Oct 10 09:05:01 localhost systemd[1]: Finished Update is Completed.
Oct 10 09:05:01 localhost systemd-udevd[730]: Using default interface naming scheme 'rhel-9.0'.
Oct 10 09:05:01 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 10 09:05:01 localhost systemd[1]: Reached target System Initialization.
Oct 10 09:05:01 localhost systemd[1]: Started dnf makecache --timer.
Oct 10 09:05:01 localhost systemd[1]: Started Daily rotation of log files.
Oct 10 09:05:01 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct 10 09:05:01 localhost systemd[1]: Reached target Timer Units.
Oct 10 09:05:01 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 10 09:05:01 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct 10 09:05:01 localhost systemd[1]: Reached target Socket Units.
Oct 10 09:05:01 localhost systemd[1]: Starting D-Bus System Message Bus...
Oct 10 09:05:01 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 10 09:05:01 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct 10 09:05:01 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 10 09:05:01 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 10 09:05:01 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 10 09:05:01 localhost systemd-udevd[740]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 09:05:01 localhost systemd[1]: Started D-Bus System Message Bus.
Oct 10 09:05:01 localhost systemd[1]: Reached target Basic System.
Oct 10 09:05:01 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct 10 09:05:01 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct 10 09:05:01 localhost dbus-broker-lau[753]: Ready
Oct 10 09:05:01 localhost systemd[1]: Starting NTP client/server...
Oct 10 09:05:01 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct 10 09:05:02 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct 10 09:05:02 localhost chronyd[785]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 10 09:05:02 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct 10 09:05:02 localhost chronyd[785]: Loaded 0 symmetric keys
Oct 10 09:05:02 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct 10 09:05:02 localhost chronyd[785]: Using right/UTC timezone to obtain leap second data
Oct 10 09:05:02 localhost chronyd[785]: Loaded seccomp filter (level 2)
Oct 10 09:05:02 localhost systemd[1]: Starting IPv4 firewall with iptables...
Oct 10 09:05:02 localhost systemd[1]: Started irqbalance daemon.
Oct 10 09:05:02 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct 10 09:05:02 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 09:05:02 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 09:05:02 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 09:05:02 localhost systemd[1]: Reached target sshd-keygen.target.
Oct 10 09:05:02 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct 10 09:05:02 localhost systemd[1]: Reached target User and Group Name Lookups.
Oct 10 09:05:02 localhost systemd[1]: Starting User Login Management...
Oct 10 09:05:02 localhost systemd[1]: Started NTP client/server.
Oct 10 09:05:02 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct 10 09:05:02 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct 10 09:05:02 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct 10 09:05:02 localhost systemd-logind[796]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 10 09:05:02 localhost systemd-logind[796]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 10 09:05:02 localhost kernel: kvm_amd: TSC scaling supported
Oct 10 09:05:02 localhost kernel: kvm_amd: Nested Virtualization enabled
Oct 10 09:05:02 localhost kernel: kvm_amd: Nested Paging enabled
Oct 10 09:05:02 localhost kernel: kvm_amd: LBR virtualization supported
Oct 10 09:05:02 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct 10 09:05:02 localhost kernel: Console: switching to colour dummy device 80x25
Oct 10 09:05:02 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct 10 09:05:02 localhost kernel: [drm] features: -context_init
Oct 10 09:05:02 localhost systemd-logind[796]: New seat seat0.
Oct 10 09:05:02 localhost systemd[1]: Started User Login Management.
Oct 10 09:05:02 localhost kernel: [drm] number of scanouts: 1
Oct 10 09:05:02 localhost kernel: [drm] number of cap sets: 0
Oct 10 09:05:02 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct 10 09:05:02 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct 10 09:05:02 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct 10 09:05:02 localhost kernel: Console: switching to colour frame buffer device 128x48
Oct 10 09:05:02 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct 10 09:05:02 localhost iptables.init[788]: iptables: Applying firewall rules: [  OK  ]
Oct 10 09:05:02 localhost systemd[1]: Finished IPv4 firewall with iptables.
Oct 10 09:05:02 localhost cloud-init[838]: Cloud-init v. 24.4-7.el9 running 'init-local' at Fri, 10 Oct 2025 09:05:02 +0000. Up 6.49 seconds.
Oct 10 09:05:03 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Oct 10 09:05:03 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Oct 10 09:05:03 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpzrdvt0mo.mount: Deactivated successfully.
Oct 10 09:05:03 localhost systemd[1]: Starting Hostname Service...
Oct 10 09:05:03 localhost systemd[1]: Started Hostname Service.
Oct 10 09:05:03 np0005479823.novalocal systemd-hostnamed[852]: Hostname set to <np0005479823.novalocal> (static)
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Reached target Preparation for Network.
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Starting Network Manager...
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.5855] NetworkManager (version 1.54.1-1.el9) is starting... (boot:d2fa8de7-cb1e-4362-bed6-d8a2357f049b)
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.5861] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6040] manager[0x5624a3fad080]: monitoring kernel firmware directory '/lib/firmware'.
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6099] hostname: hostname: using hostnamed
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6101] hostname: static hostname changed from (none) to "np0005479823.novalocal"
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6107] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6234] manager[0x5624a3fad080]: rfkill: Wi-Fi hardware radio set enabled
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6235] manager[0x5624a3fad080]: rfkill: WWAN hardware radio set enabled
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6304] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6305] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6305] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6306] manager: Networking is enabled by state file
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6308] settings: Loaded settings plugin: keyfile (internal)
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6346] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6379] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6403] dhcp: init: Using DHCP client 'internal'
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6405] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6416] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6426] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6433] device (lo): Activation: starting connection 'lo' (b2f4c0ce-6660-4aa4-ac06-17229f19cc05)
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6442] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6445] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6472] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6476] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6478] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6479] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6481] device (eth0): carrier: link connected
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6483] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6487] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6491] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6494] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6494] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6496] manager: NetworkManager state is now CONNECTING
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6496] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6501] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6503] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6544] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6550] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6565] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Started Network Manager.
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Reached target Network.
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6805] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6807] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6813] device (lo): Activation: successful, device activated.
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6818] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6819] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6821] manager: NetworkManager state is now CONNECTED_SITE
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6823] device (eth0): Activation: successful, device activated.
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6827] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 10 09:05:03 np0005479823.novalocal NetworkManager[856]: <info>  [1760087103.6829] manager: startup complete
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Reached target NFS client services.
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: Reached target Remote File Systems.
Oct 10 09:05:03 np0005479823.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: Cloud-init v. 24.4-7.el9 running 'init' at Fri, 10 Oct 2025 09:05:04 +0000. Up 7.61 seconds.
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: |  eth0  | True |         38.102.83.22         | 255.255.255.0 | global | fa:16:3e:a7:e1:7f |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: |  eth0  | True | fe80::f816:3eff:fea7:e17f/64 |       .       |  link  | fa:16:3e:a7:e1:7f |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Oct 10 09:05:04 np0005479823.novalocal cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 10 09:05:05 np0005479823.novalocal useradd[985]: new group: name=cloud-user, GID=1001
Oct 10 09:05:05 np0005479823.novalocal useradd[985]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Oct 10 09:05:05 np0005479823.novalocal useradd[985]: add 'cloud-user' to group 'adm'
Oct 10 09:05:05 np0005479823.novalocal useradd[985]: add 'cloud-user' to group 'systemd-journal'
Oct 10 09:05:05 np0005479823.novalocal useradd[985]: add 'cloud-user' to shadow group 'adm'
Oct 10 09:05:05 np0005479823.novalocal useradd[985]: add 'cloud-user' to shadow group 'systemd-journal'
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: Generating public/private rsa key pair.
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: The key fingerprint is:
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: SHA256:t/SQhZU1w5Y/d91ueZ7uPOfeowunYPDSSoyEHlJ//70 root@np0005479823.novalocal
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: The key's randomart image is:
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: +---[RSA 3072]----+
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |            .++. |
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |           o. +o |
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |   .      . .. .o|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |  . o      o   .*|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: | . o o oS =    .=|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |  o o + =o +   .+|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |   . . + *.... oo|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |      . + o = .++|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |       .   o E**B|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: +----[SHA256]-----+
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: Generating public/private ecdsa key pair.
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: The key fingerprint is:
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: SHA256:99zX2Y02Xg+OcLpeocHbNt/2VxdcdEc959JD/kka7qU root@np0005479823.novalocal
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: The key's randomart image is:
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: +---[ECDSA 256]---+
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |               o*|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |               o*|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |              +o+|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |         .   ..*o|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |        S + o +.=|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |         . B = +O|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |          + X O.O|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |           * E *+|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |         .+.. +.*|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: +----[SHA256]-----+
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: Generating public/private ed25519 key pair.
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: The key fingerprint is:
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: SHA256:An2o15wB8EXt32HDzImmdKZQcdQjbwVK+IdUItPhxRE root@np0005479823.novalocal
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: The key's randomart image is:
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: +--[ED25519 256]--+
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |    ....o.+=*=E+ |
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |     o +  +*+++ .|
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |    . + oo ooO + |
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |     o +.oo B %  |
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |    . o So B = o |
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |     . .  o . .  |
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |                 |
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |                 |
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: |                 |
Oct 10 09:05:05 np0005479823.novalocal cloud-init[919]: +----[SHA256]-----+
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Reached target Cloud-config availability.
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Reached target Network is Online.
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Starting System Logging Service...
Oct 10 09:05:05 np0005479823.novalocal sm-notify[1000]: Version 2.5.4 starting
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Starting OpenSSH server daemon...
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Starting Permit User Sessions...
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Started Notify NFS peers of a restart.
Oct 10 09:05:05 np0005479823.novalocal sshd[1002]: Server listening on 0.0.0.0 port 22.
Oct 10 09:05:05 np0005479823.novalocal sshd[1002]: Server listening on :: port 22.
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Started OpenSSH server daemon.
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Finished Permit User Sessions.
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Started Command Scheduler.
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Started Getty on tty1.
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Started Serial Getty on ttyS0.
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Reached target Login Prompts.
Oct 10 09:05:05 np0005479823.novalocal crond[1004]: (CRON) STARTUP (1.5.7)
Oct 10 09:05:05 np0005479823.novalocal crond[1004]: (CRON) INFO (Syslog will be used instead of sendmail.)
Oct 10 09:05:05 np0005479823.novalocal crond[1004]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 99% if used.)
Oct 10 09:05:05 np0005479823.novalocal crond[1004]: (CRON) INFO (running with inotify support)
Oct 10 09:05:05 np0005479823.novalocal rsyslogd[1001]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1001" x-info="https://www.rsyslog.com"] start
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Started System Logging Service.
Oct 10 09:05:05 np0005479823.novalocal rsyslogd[1001]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Reached target Multi-User System.
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct 10 09:05:05 np0005479823.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Oct 10 09:05:06 np0005479823.novalocal rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 09:05:06 np0005479823.novalocal cloud-init[1014]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Fri, 10 Oct 2025 09:05:06 +0000. Up 9.76 seconds.
Oct 10 09:05:06 np0005479823.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Oct 10 09:05:06 np0005479823.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Oct 10 09:05:06 np0005479823.novalocal cloud-init[1018]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Fri, 10 Oct 2025 09:05:06 +0000. Up 10.15 seconds.
Oct 10 09:05:06 np0005479823.novalocal cloud-init[1021]: #############################################################
Oct 10 09:05:06 np0005479823.novalocal cloud-init[1023]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct 10 09:05:06 np0005479823.novalocal sshd-session[1019]: Connection reset by 38.102.83.114 port 36392 [preauth]
Oct 10 09:05:06 np0005479823.novalocal cloud-init[1026]: 256 SHA256:99zX2Y02Xg+OcLpeocHbNt/2VxdcdEc959JD/kka7qU root@np0005479823.novalocal (ECDSA)
Oct 10 09:05:06 np0005479823.novalocal sshd-session[1025]: Unable to negotiate with 38.102.83.114 port 36398: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Oct 10 09:05:06 np0005479823.novalocal cloud-init[1030]: 256 SHA256:An2o15wB8EXt32HDzImmdKZQcdQjbwVK+IdUItPhxRE root@np0005479823.novalocal (ED25519)
Oct 10 09:05:06 np0005479823.novalocal cloud-init[1033]: 3072 SHA256:t/SQhZU1w5Y/d91ueZ7uPOfeowunYPDSSoyEHlJ//70 root@np0005479823.novalocal (RSA)
Oct 10 09:05:06 np0005479823.novalocal cloud-init[1034]: -----END SSH HOST KEY FINGERPRINTS-----
Oct 10 09:05:06 np0005479823.novalocal cloud-init[1036]: #############################################################
Oct 10 09:05:06 np0005479823.novalocal sshd-session[1035]: Unable to negotiate with 38.102.83.114 port 36430: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Oct 10 09:05:06 np0005479823.novalocal sshd-session[1041]: Unable to negotiate with 38.102.83.114 port 36442: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Oct 10 09:05:06 np0005479823.novalocal cloud-init[1018]: Cloud-init v. 24.4-7.el9 finished at Fri, 10 Oct 2025 09:05:06 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.36 seconds
Oct 10 09:05:06 np0005479823.novalocal sshd-session[1043]: Connection reset by 38.102.83.114 port 36458 [preauth]
Oct 10 09:05:06 np0005479823.novalocal sshd-session[1029]: Connection closed by 38.102.83.114 port 36414 [preauth]
Oct 10 09:05:06 np0005479823.novalocal sshd-session[1047]: Unable to negotiate with 38.102.83.114 port 36472: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Oct 10 09:05:06 np0005479823.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Oct 10 09:05:06 np0005479823.novalocal systemd[1]: Reached target Cloud-init target.
Oct 10 09:05:06 np0005479823.novalocal systemd[1]: Startup finished in 1.498s (kernel) + 2.394s (initrd) + 6.528s (userspace) = 10.421s.
Oct 10 09:05:06 np0005479823.novalocal sshd-session[1049]: Unable to negotiate with 38.102.83.114 port 36484: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Oct 10 09:05:06 np0005479823.novalocal sshd-session[1045]: Connection closed by 38.102.83.114 port 36464 [preauth]
Oct 10 09:05:08 np0005479823.novalocal chronyd[785]: Selected source 45.61.49.156 (2.centos.pool.ntp.org)
Oct 10 09:05:08 np0005479823.novalocal chronyd[785]: System clock TAI offset set to 37 seconds
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: Cannot change IRQ 25 affinity: Operation not permitted
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: IRQ 25 affinity is now unmanaged
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: Cannot change IRQ 31 affinity: Operation not permitted
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: IRQ 31 affinity is now unmanaged
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: Cannot change IRQ 28 affinity: Operation not permitted
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: IRQ 28 affinity is now unmanaged
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: Cannot change IRQ 32 affinity: Operation not permitted
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: IRQ 32 affinity is now unmanaged
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: Cannot change IRQ 30 affinity: Operation not permitted
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: IRQ 30 affinity is now unmanaged
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: Cannot change IRQ 29 affinity: Operation not permitted
Oct 10 09:05:12 np0005479823.novalocal irqbalance[790]: IRQ 29 affinity is now unmanaged
Oct 10 09:05:13 np0005479823.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 09:05:19 np0005479823.novalocal sshd-session[1051]: Accepted publickey for zuul from 38.102.83.114 port 45102 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Oct 10 09:05:19 np0005479823.novalocal systemd[1]: Created slice User Slice of UID 1000.
Oct 10 09:05:19 np0005479823.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct 10 09:05:19 np0005479823.novalocal systemd-logind[796]: New session 1 of user zuul.
Oct 10 09:05:19 np0005479823.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct 10 09:05:19 np0005479823.novalocal systemd[1]: Starting User Manager for UID 1000...
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Queued start job for default target Main User Target.
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Created slice User Application Slice.
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Started Daily Cleanup of User's Temporary Directories.
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Reached target Paths.
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Reached target Timers.
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Starting D-Bus User Message Bus Socket...
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Starting Create User's Volatile Files and Directories...
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Finished Create User's Volatile Files and Directories.
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Listening on D-Bus User Message Bus Socket.
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Reached target Sockets.
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Reached target Basic System.
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Reached target Main User Target.
Oct 10 09:05:19 np0005479823.novalocal systemd[1055]: Startup finished in 127ms.
Oct 10 09:05:19 np0005479823.novalocal systemd[1]: Started User Manager for UID 1000.
Oct 10 09:05:19 np0005479823.novalocal systemd[1]: Started Session 1 of User zuul.
Oct 10 09:05:19 np0005479823.novalocal sshd-session[1051]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:05:20 np0005479823.novalocal python3[1138]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:05:25 np0005479823.novalocal python3[1166]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:05:32 np0005479823.novalocal python3[1224]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:05:33 np0005479823.novalocal python3[1264]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct 10 09:05:33 np0005479823.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 10 09:05:35 np0005479823.novalocal python3[1292]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDEBkxJ4sw2+DK3cAbafLjRenK6XkRzPrF3EgUC0Qy/9kZ0kuErGkKyCEXRNE93NnKaUfoU9ebcJtP/W0B6xem+P337Yb5eT1d5d0DPlSyJ224O/rNncfiIo6YcMhrWXlb8yWwfHogZqjmOgJoH57cdsVMt26tUmFXzrJ1qEBloCvfoEe/tx8o3aeflIhUQ0zm2bbmhRn09oGRCODyyr02YoJZm5GbMiTb7mz8xvM31PEo8DzS5ti1YMOUi76ojLKIS6hZkIk4sUuSXmOwBoYhmyGjvs8csl/rxfVJq3bV+DFnatOKlFCyjgY0Ed4oCeReEGI6h29najM/8mUzfOeBj0dyWj3N3oOwlewtF5ifTB4JPwfEN1Rx37wbEzN/2Q7MOKzeWDxP2E0trD5ey9oqWFCpRpuJURMiPr+A6h070uR8U8vUNxGtH3vAmkuN+p3w79WF1wzlCmcoC+oSdwETcoOqkD84qkNgYJpVVpboSnwBo/H/aPJuJhs/nYPhz+c= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:35 np0005479823.novalocal python3[1316]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:36 np0005479823.novalocal python3[1415]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:05:36 np0005479823.novalocal python3[1486]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760087136.1893063-254-12633506510624/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=bea29065a9ff49468ede17c902a062ce_id_rsa follow=False checksum=6477c55dd7b29e382b0ff49c34043ebcd2bcc305 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:37 np0005479823.novalocal python3[1609]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:05:37 np0005479823.novalocal python3[1680]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760087137.1955144-308-126061657745081/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=bea29065a9ff49468ede17c902a062ce_id_rsa.pub follow=False checksum=8b86d6c8317b3a249fa7c3a90607af8e51a186ef backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:39 np0005479823.novalocal python3[1728]: ansible-ping Invoked with data=pong
Oct 10 09:05:40 np0005479823.novalocal python3[1752]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:05:42 np0005479823.novalocal python3[1810]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct 10 09:05:43 np0005479823.novalocal python3[1842]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:44 np0005479823.novalocal python3[1866]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:44 np0005479823.novalocal python3[1890]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:44 np0005479823.novalocal python3[1914]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:44 np0005479823.novalocal python3[1938]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:45 np0005479823.novalocal python3[1962]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:46 np0005479823.novalocal sudo[1986]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmlchztmlngdeqvkltcrqcnnaghgtdsj ; /usr/bin/python3'
Oct 10 09:05:46 np0005479823.novalocal sudo[1986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:05:47 np0005479823.novalocal python3[1988]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:47 np0005479823.novalocal sudo[1986]: pam_unix(sudo:session): session closed for user root
Oct 10 09:05:47 np0005479823.novalocal sudo[2064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hashaduosgvvmvzcryeftwtfbnvckqxj ; /usr/bin/python3'
Oct 10 09:05:47 np0005479823.novalocal sudo[2064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:05:47 np0005479823.novalocal python3[2066]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:05:47 np0005479823.novalocal sudo[2064]: pam_unix(sudo:session): session closed for user root
Oct 10 09:05:48 np0005479823.novalocal sudo[2137]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooyhifqypsjihykpanqkantwrtgzynxt ; /usr/bin/python3'
Oct 10 09:05:48 np0005479823.novalocal sudo[2137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:05:48 np0005479823.novalocal python3[2139]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760087147.2492192-34-111049401545547/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:48 np0005479823.novalocal sudo[2137]: pam_unix(sudo:session): session closed for user root
Oct 10 09:05:48 np0005479823.novalocal python3[2187]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:49 np0005479823.novalocal python3[2211]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:49 np0005479823.novalocal python3[2235]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:49 np0005479823.novalocal python3[2259]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:50 np0005479823.novalocal python3[2283]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:50 np0005479823.novalocal python3[2307]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:50 np0005479823.novalocal python3[2331]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:50 np0005479823.novalocal python3[2355]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:51 np0005479823.novalocal python3[2379]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:51 np0005479823.novalocal python3[2403]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:51 np0005479823.novalocal python3[2427]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:52 np0005479823.novalocal python3[2451]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:52 np0005479823.novalocal python3[2475]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:52 np0005479823.novalocal python3[2499]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:52 np0005479823.novalocal python3[2523]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:53 np0005479823.novalocal python3[2547]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:53 np0005479823.novalocal python3[2571]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:53 np0005479823.novalocal python3[2595]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:54 np0005479823.novalocal python3[2619]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:54 np0005479823.novalocal python3[2643]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:54 np0005479823.novalocal python3[2667]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:54 np0005479823.novalocal python3[2691]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:55 np0005479823.novalocal python3[2715]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:55 np0005479823.novalocal python3[2739]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:55 np0005479823.novalocal python3[2763]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:56 np0005479823.novalocal python3[2787]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:05:58 np0005479823.novalocal sudo[2811]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugvpdfdnnkxrpnxwxikcyvdljspmgdvo ; /usr/bin/python3'
Oct 10 09:05:58 np0005479823.novalocal sudo[2811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:05:58 np0005479823.novalocal python3[2813]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 10 09:05:58 np0005479823.novalocal systemd[1]: Starting Time & Date Service...
Oct 10 09:05:58 np0005479823.novalocal systemd[1]: Started Time & Date Service.
Oct 10 09:05:58 np0005479823.novalocal systemd-timedated[2815]: Changed time zone to 'UTC' (UTC).
Oct 10 09:05:58 np0005479823.novalocal sudo[2811]: pam_unix(sudo:session): session closed for user root
Oct 10 09:05:58 np0005479823.novalocal sudo[2842]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipxeaegkdkglusnhydymwdjbqrhhymfa ; /usr/bin/python3'
Oct 10 09:05:58 np0005479823.novalocal sudo[2842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:05:58 np0005479823.novalocal python3[2844]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:05:58 np0005479823.novalocal sudo[2842]: pam_unix(sudo:session): session closed for user root
Oct 10 09:05:59 np0005479823.novalocal python3[2920]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:05:59 np0005479823.novalocal python3[2991]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1760087159.1882753-254-206617524013399/source _original_basename=tmpzodar4gc follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:06:00 np0005479823.novalocal python3[3091]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:06:00 np0005479823.novalocal python3[3162]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1760087160.0538177-304-208790470661808/source _original_basename=tmpc6zltfz1 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:06:01 np0005479823.novalocal sudo[3262]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqhmnafrraoxowmpstkdtfgqofvrheue ; /usr/bin/python3'
Oct 10 09:06:01 np0005479823.novalocal sudo[3262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:06:01 np0005479823.novalocal python3[3264]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:06:01 np0005479823.novalocal sudo[3262]: pam_unix(sudo:session): session closed for user root
Oct 10 09:06:02 np0005479823.novalocal sudo[3335]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwvwangfgfpohirfqjspocojmkmqdxsa ; /usr/bin/python3'
Oct 10 09:06:02 np0005479823.novalocal sudo[3335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:06:02 np0005479823.novalocal python3[3337]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1760087161.3989956-384-198295954028806/source _original_basename=tmpsfb2cfi9 follow=False checksum=0a5264336eaf669ce906803fabc64043ef3757da backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:06:02 np0005479823.novalocal sudo[3335]: pam_unix(sudo:session): session closed for user root
Oct 10 09:06:02 np0005479823.novalocal python3[3385]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:06:03 np0005479823.novalocal python3[3411]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:06:03 np0005479823.novalocal sudo[3489]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwsfcgwlyngryxoqpgumyxrmduvzkryj ; /usr/bin/python3'
Oct 10 09:06:03 np0005479823.novalocal sudo[3489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:06:03 np0005479823.novalocal python3[3491]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:06:03 np0005479823.novalocal sudo[3489]: pam_unix(sudo:session): session closed for user root
Oct 10 09:06:03 np0005479823.novalocal sudo[3562]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffjunnsydogqcvwlwsguqasmgpljbctl ; /usr/bin/python3'
Oct 10 09:06:03 np0005479823.novalocal sudo[3562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:06:03 np0005479823.novalocal python3[3564]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1760087163.2109654-454-274442428240427/source _original_basename=tmpg4pk5syj follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:06:03 np0005479823.novalocal sudo[3562]: pam_unix(sudo:session): session closed for user root
Oct 10 09:06:04 np0005479823.novalocal sudo[3613]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcytachvntgkfkeyleftofzskjkcyvjp ; /usr/bin/python3'
Oct 10 09:06:04 np0005479823.novalocal sudo[3613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:06:04 np0005479823.novalocal python3[3615]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-80e1-2ccb-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:06:04 np0005479823.novalocal sudo[3613]: pam_unix(sudo:session): session closed for user root
Oct 10 09:06:05 np0005479823.novalocal python3[3643]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-80e1-2ccb-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct 10 09:06:06 np0005479823.novalocal python3[3671]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:06:23 np0005479823.novalocal sudo[3695]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-churecebgiidwwfbtkzkqvoyrmosriby ; /usr/bin/python3'
Oct 10 09:06:23 np0005479823.novalocal sudo[3695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:06:24 np0005479823.novalocal python3[3697]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:06:24 np0005479823.novalocal sudo[3695]: pam_unix(sudo:session): session closed for user root
Oct 10 09:06:28 np0005479823.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 10 09:07:24 np0005479823.novalocal sshd-session[1065]: Received disconnect from 38.102.83.114 port 45102:11: disconnected by user
Oct 10 09:07:24 np0005479823.novalocal sshd-session[1065]: Disconnected from user zuul 38.102.83.114 port 45102
Oct 10 09:07:24 np0005479823.novalocal sshd-session[1051]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:07:24 np0005479823.novalocal systemd-logind[796]: Session 1 logged out. Waiting for processes to exit.
Oct 10 09:07:52 np0005479823.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 10 09:07:52 np0005479823.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct 10 09:07:52 np0005479823.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct 10 09:07:52 np0005479823.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct 10 09:07:52 np0005479823.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct 10 09:07:52 np0005479823.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct 10 09:07:52 np0005479823.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct 10 09:07:52 np0005479823.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct 10 09:07:52 np0005479823.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct 10 09:07:52 np0005479823.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct 10 09:07:52 np0005479823.novalocal NetworkManager[856]: <info>  [1760087272.5440] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 10 09:07:52 np0005479823.novalocal systemd-udevd[3700]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 09:07:52 np0005479823.novalocal NetworkManager[856]: <info>  [1760087272.5639] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:07:52 np0005479823.novalocal NetworkManager[856]: <info>  [1760087272.5667] settings: (eth1): created default wired connection 'Wired connection 1'
Oct 10 09:07:52 np0005479823.novalocal NetworkManager[856]: <info>  [1760087272.5670] device (eth1): carrier: link connected
Oct 10 09:07:52 np0005479823.novalocal NetworkManager[856]: <info>  [1760087272.5671] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 10 09:07:52 np0005479823.novalocal NetworkManager[856]: <info>  [1760087272.5675] policy: auto-activating connection 'Wired connection 1' (9070ab9c-fab6-3aab-b68b-48035af180d0)
Oct 10 09:07:52 np0005479823.novalocal NetworkManager[856]: <info>  [1760087272.5679] device (eth1): Activation: starting connection 'Wired connection 1' (9070ab9c-fab6-3aab-b68b-48035af180d0)
Oct 10 09:07:52 np0005479823.novalocal NetworkManager[856]: <info>  [1760087272.5679] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:07:52 np0005479823.novalocal NetworkManager[856]: <info>  [1760087272.5681] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:07:52 np0005479823.novalocal NetworkManager[856]: <info>  [1760087272.5686] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:07:52 np0005479823.novalocal NetworkManager[856]: <info>  [1760087272.5690] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 10 09:07:52 np0005479823.novalocal systemd[1055]: Starting Mark boot as successful...
Oct 10 09:07:52 np0005479823.novalocal systemd[1055]: Finished Mark boot as successful.
Oct 10 09:07:53 np0005479823.novalocal sshd-session[3705]: Accepted publickey for zuul from 38.102.83.114 port 43488 ssh2: RSA SHA256:RwPGCkYG1Mlcunwa9tTlXvLSrYLunSGhwxtMMuIfos4
Oct 10 09:07:53 np0005479823.novalocal systemd-logind[796]: New session 3 of user zuul.
Oct 10 09:07:53 np0005479823.novalocal systemd[1]: Started Session 3 of User zuul.
Oct 10 09:07:53 np0005479823.novalocal sshd-session[3705]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:07:53 np0005479823.novalocal python3[3732]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-dbf0-3472-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:08:03 np0005479823.novalocal sudo[3810]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcxhzfxbpbkqemgqtzczenahcjbsqtjo ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 10 09:08:03 np0005479823.novalocal sudo[3810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:08:03 np0005479823.novalocal python3[3812]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:08:04 np0005479823.novalocal sudo[3810]: pam_unix(sudo:session): session closed for user root
Oct 10 09:08:04 np0005479823.novalocal sudo[3883]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nihhmrndmfcxhpcfbdtjvyciieravykb ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 10 09:08:04 np0005479823.novalocal sudo[3883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:08:04 np0005479823.novalocal python3[3885]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760087283.6884868-206-60086575068802/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=f779573b6ceb38b51d12ffbc9edceedeba50f1e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:08:04 np0005479823.novalocal sudo[3883]: pam_unix(sudo:session): session closed for user root
Oct 10 09:08:04 np0005479823.novalocal sudo[3933]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbswacupkigiodoepynxecugapxzguer ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 10 09:08:04 np0005479823.novalocal sudo[3933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:08:05 np0005479823.novalocal python3[3935]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: Stopped Network Manager Wait Online.
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: Stopping Network Manager Wait Online...
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: Stopping Network Manager...
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[856]: <info>  [1760087285.0358] caught SIGTERM, shutting down normally.
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[856]: <info>  [1760087285.0370] dhcp4 (eth0): canceled DHCP transaction
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[856]: <info>  [1760087285.0370] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[856]: <info>  [1760087285.0370] dhcp4 (eth0): state changed no lease
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[856]: <info>  [1760087285.0373] manager: NetworkManager state is now CONNECTING
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[856]: <info>  [1760087285.0519] dhcp4 (eth1): canceled DHCP transaction
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[856]: <info>  [1760087285.0519] dhcp4 (eth1): state changed no lease
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[856]: <info>  [1760087285.0632] exiting (success)
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: Stopped Network Manager.
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: NetworkManager.service: Consumed 1.241s CPU time, 10.2M memory peak.
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: Starting Network Manager...
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.1220] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:d2fa8de7-cb1e-4362-bed6-d8a2357f049b)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.1222] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.1279] manager[0x561a7f298070]: monitoring kernel firmware directory '/lib/firmware'.
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: Starting Hostname Service...
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: Started Hostname Service.
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2356] hostname: hostname: using hostnamed
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2357] hostname: static hostname changed from (none) to "np0005479823.novalocal"
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2365] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2372] manager[0x561a7f298070]: rfkill: Wi-Fi hardware radio set enabled
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2373] manager[0x561a7f298070]: rfkill: WWAN hardware radio set enabled
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2404] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2405] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2405] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2406] manager: Networking is enabled by state file
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2408] settings: Loaded settings plugin: keyfile (internal)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2412] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2438] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2446] dhcp: init: Using DHCP client 'internal'
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2449] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2454] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2460] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2469] device (lo): Activation: starting connection 'lo' (b2f4c0ce-6660-4aa4-ac06-17229f19cc05)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2477] device (eth0): carrier: link connected
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2482] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2489] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2489] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2496] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2502] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2509] device (eth1): carrier: link connected
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2513] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2519] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (9070ab9c-fab6-3aab-b68b-48035af180d0) (indicated)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2519] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2525] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2532] device (eth1): Activation: starting connection 'Wired connection 1' (9070ab9c-fab6-3aab-b68b-48035af180d0)
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: Started Network Manager.
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2540] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2546] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2550] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2553] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2557] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2563] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2566] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2571] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2575] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2582] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2586] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2595] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2598] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2612] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2617] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2622] device (lo): Activation: successful, device activated.
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2637] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2643] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2720] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2744] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2745] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2750] manager: NetworkManager state is now CONNECTED_SITE
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2754] device (eth0): Activation: successful, device activated.
Oct 10 09:08:05 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087285.2760] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 10 09:08:05 np0005479823.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 10 09:08:05 np0005479823.novalocal sudo[3933]: pam_unix(sudo:session): session closed for user root
Oct 10 09:08:05 np0005479823.novalocal python3[4019]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-dbf0-3472-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:08:15 np0005479823.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 09:08:35 np0005479823.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.3890] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 10 09:08:50 np0005479823.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 09:08:50 np0005479823.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4203] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4205] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4214] device (eth1): Activation: successful, device activated.
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4221] manager: startup complete
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4223] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <warn>  [1760087330.4228] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4236] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct 10 09:08:50 np0005479823.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4358] dhcp4 (eth1): canceled DHCP transaction
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4359] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4359] dhcp4 (eth1): state changed no lease
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4371] policy: auto-activating connection 'ci-private-network' (97070329-66da-5289-8aaa-712e43fb35a8)
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4375] device (eth1): Activation: starting connection 'ci-private-network' (97070329-66da-5289-8aaa-712e43fb35a8)
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4376] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4379] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4384] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4391] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4433] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4435] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:08:50 np0005479823.novalocal NetworkManager[3947]: <info>  [1760087330.4449] device (eth1): Activation: successful, device activated.
Oct 10 09:09:00 np0005479823.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 09:09:05 np0005479823.novalocal sshd-session[3708]: Received disconnect from 38.102.83.114 port 43488:11: disconnected by user
Oct 10 09:09:05 np0005479823.novalocal sshd-session[3708]: Disconnected from user zuul 38.102.83.114 port 43488
Oct 10 09:09:05 np0005479823.novalocal sshd-session[3705]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:09:05 np0005479823.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Oct 10 09:09:05 np0005479823.novalocal systemd[1]: session-3.scope: Consumed 1.645s CPU time.
Oct 10 09:09:05 np0005479823.novalocal systemd-logind[796]: Session 3 logged out. Waiting for processes to exit.
Oct 10 09:09:05 np0005479823.novalocal systemd-logind[796]: Removed session 3.
Oct 10 09:09:19 np0005479823.novalocal sshd-session[4047]: Accepted publickey for zuul from 38.102.83.114 port 56164 ssh2: RSA SHA256:RwPGCkYG1Mlcunwa9tTlXvLSrYLunSGhwxtMMuIfos4
Oct 10 09:09:19 np0005479823.novalocal systemd-logind[796]: New session 4 of user zuul.
Oct 10 09:09:19 np0005479823.novalocal systemd[1]: Started Session 4 of User zuul.
Oct 10 09:09:19 np0005479823.novalocal sshd-session[4047]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:09:20 np0005479823.novalocal sudo[4126]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oicmhzwhwozikoecmqpzofqokrxnnsic ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 10 09:09:20 np0005479823.novalocal sudo[4126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:09:20 np0005479823.novalocal python3[4128]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:09:20 np0005479823.novalocal sudo[4126]: pam_unix(sudo:session): session closed for user root
Oct 10 09:09:20 np0005479823.novalocal sudo[4199]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxjujxdanjdxkyitadvvdqqaeuzmvnda ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 10 09:09:20 np0005479823.novalocal sudo[4199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:09:20 np0005479823.novalocal python3[4201]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760087359.9086897-373-209249118085449/source _original_basename=tmp2_e38k7a follow=False checksum=0edcb8668707f95c4678608a04fc39cdafb654ec backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:09:20 np0005479823.novalocal sudo[4199]: pam_unix(sudo:session): session closed for user root
Oct 10 09:09:23 np0005479823.novalocal sshd-session[4050]: Connection closed by 38.102.83.114 port 56164
Oct 10 09:09:23 np0005479823.novalocal sshd-session[4047]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:09:23 np0005479823.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Oct 10 09:09:23 np0005479823.novalocal systemd-logind[796]: Session 4 logged out. Waiting for processes to exit.
Oct 10 09:09:23 np0005479823.novalocal systemd-logind[796]: Removed session 4.
Oct 10 09:10:59 np0005479823.novalocal systemd[1055]: Created slice User Background Tasks Slice.
Oct 10 09:10:59 np0005479823.novalocal systemd[1055]: Starting Cleanup of User's Temporary Files and Directories...
Oct 10 09:10:59 np0005479823.novalocal systemd[1055]: Finished Cleanup of User's Temporary Files and Directories.
Oct 10 09:15:52 np0005479823.novalocal sshd-session[4230]: Accepted publickey for zuul from 38.102.83.114 port 57722 ssh2: RSA SHA256:RwPGCkYG1Mlcunwa9tTlXvLSrYLunSGhwxtMMuIfos4
Oct 10 09:15:52 np0005479823.novalocal systemd-logind[796]: New session 5 of user zuul.
Oct 10 09:15:52 np0005479823.novalocal systemd[1]: Started Session 5 of User zuul.
Oct 10 09:15:52 np0005479823.novalocal sshd-session[4230]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:15:52 np0005479823.novalocal sudo[4257]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fslxdzezmwukxxuggsnmryxrvcjdbygr ; /usr/bin/python3'
Oct 10 09:15:52 np0005479823.novalocal sudo[4257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:52 np0005479823.novalocal python3[4259]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163efc-24cc-305a-504c-000000001cfe-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:15:52 np0005479823.novalocal sudo[4257]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:53 np0005479823.novalocal sudo[4285]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbfshwhmhyszqtcutvyovqyolqmnkshv ; /usr/bin/python3'
Oct 10 09:15:53 np0005479823.novalocal sudo[4285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:53 np0005479823.novalocal python3[4287]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:15:53 np0005479823.novalocal sudo[4285]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:53 np0005479823.novalocal sudo[4312]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asufeovcwksepdjmkukymukwupxajiee ; /usr/bin/python3'
Oct 10 09:15:53 np0005479823.novalocal sudo[4312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:53 np0005479823.novalocal python3[4314]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:15:53 np0005479823.novalocal sudo[4312]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:53 np0005479823.novalocal sudo[4338]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shlqfmazwnvndajiqpnbwnzkomvjdtmz ; /usr/bin/python3'
Oct 10 09:15:53 np0005479823.novalocal sudo[4338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:53 np0005479823.novalocal python3[4340]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:15:53 np0005479823.novalocal sudo[4338]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:53 np0005479823.novalocal sudo[4364]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vradjgmesaqduswhmatdtfsiizmrfrok ; /usr/bin/python3'
Oct 10 09:15:53 np0005479823.novalocal sudo[4364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:54 np0005479823.novalocal python3[4366]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:15:54 np0005479823.novalocal sudo[4364]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:54 np0005479823.novalocal sudo[4390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwtyzvaqthcimvrjlfxnhmpycvlpofah ; /usr/bin/python3'
Oct 10 09:15:54 np0005479823.novalocal sudo[4390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:54 np0005479823.novalocal python3[4392]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:15:54 np0005479823.novalocal python3[4392]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct 10 09:15:54 np0005479823.novalocal sudo[4390]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:55 np0005479823.novalocal sudo[4416]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcshadzxhrgnztttwghvtplleucemzqj ; /usr/bin/python3'
Oct 10 09:15:55 np0005479823.novalocal sudo[4416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:55 np0005479823.novalocal python3[4418]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 09:15:55 np0005479823.novalocal systemd[1]: Reloading.
Oct 10 09:15:55 np0005479823.novalocal systemd-rc-local-generator[4440]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:15:55 np0005479823.novalocal sudo[4416]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:56 np0005479823.novalocal sudo[4473]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnvoqlholwbdplsyrcefaqskfztykokr ; /usr/bin/python3'
Oct 10 09:15:56 np0005479823.novalocal sudo[4473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:57 np0005479823.novalocal python3[4475]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct 10 09:15:57 np0005479823.novalocal sudo[4473]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:57 np0005479823.novalocal sudo[4499]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krvvtarjepocbvwnqrrsoxizwsaispsc ; /usr/bin/python3'
Oct 10 09:15:57 np0005479823.novalocal sudo[4499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:57 np0005479823.novalocal python3[4501]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:15:57 np0005479823.novalocal sudo[4499]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:57 np0005479823.novalocal sudo[4527]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksrnvbhzxaaztnzpuhifunsrothivcrr ; /usr/bin/python3'
Oct 10 09:15:57 np0005479823.novalocal sudo[4527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:57 np0005479823.novalocal python3[4529]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:15:57 np0005479823.novalocal sudo[4527]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:57 np0005479823.novalocal sudo[4555]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwtcfmiuplptdxmrpuomjogcdmsxtxys ; /usr/bin/python3'
Oct 10 09:15:57 np0005479823.novalocal sudo[4555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:58 np0005479823.novalocal python3[4557]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:15:58 np0005479823.novalocal sudo[4555]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:58 np0005479823.novalocal sudo[4583]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkjpshawkgnrbnyyimbbytynvjaiqrll ; /usr/bin/python3'
Oct 10 09:15:58 np0005479823.novalocal sudo[4583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:15:58 np0005479823.novalocal python3[4585]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:15:58 np0005479823.novalocal sudo[4583]: pam_unix(sudo:session): session closed for user root
Oct 10 09:15:59 np0005479823.novalocal python3[4613]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163efc-24cc-305a-504c-000000001d04-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:15:59 np0005479823.novalocal python3[4643]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:16:02 np0005479823.novalocal sshd-session[4233]: Connection closed by 38.102.83.114 port 57722
Oct 10 09:16:02 np0005479823.novalocal sshd-session[4230]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:16:02 np0005479823.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Oct 10 09:16:02 np0005479823.novalocal systemd[1]: session-5.scope: Consumed 3.552s CPU time.
Oct 10 09:16:02 np0005479823.novalocal systemd-logind[796]: Session 5 logged out. Waiting for processes to exit.
Oct 10 09:16:02 np0005479823.novalocal systemd-logind[796]: Removed session 5.
Oct 10 09:16:04 np0005479823.novalocal sshd-session[4648]: Accepted publickey for zuul from 38.102.83.114 port 44568 ssh2: RSA SHA256:RwPGCkYG1Mlcunwa9tTlXvLSrYLunSGhwxtMMuIfos4
Oct 10 09:16:04 np0005479823.novalocal systemd-logind[796]: New session 6 of user zuul.
Oct 10 09:16:04 np0005479823.novalocal systemd[1]: Started Session 6 of User zuul.
Oct 10 09:16:04 np0005479823.novalocal sshd-session[4648]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:16:04 np0005479823.novalocal sudo[4675]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuijmhykwaambfnkfnkgfwqksvxpqqoj ; /usr/bin/python3'
Oct 10 09:16:04 np0005479823.novalocal sudo[4675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:16:04 np0005479823.novalocal python3[4677]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 10 09:16:37 np0005479823.novalocal kernel: SELinux:  Converting 363 SID table entries...
Oct 10 09:16:37 np0005479823.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 09:16:37 np0005479823.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 10 09:16:37 np0005479823.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 09:16:37 np0005479823.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 10 09:16:37 np0005479823.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 09:16:37 np0005479823.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 09:16:37 np0005479823.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 09:16:47 np0005479823.novalocal kernel: SELinux:  Converting 363 SID table entries...
Oct 10 09:16:47 np0005479823.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 09:16:47 np0005479823.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 10 09:16:47 np0005479823.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 09:16:47 np0005479823.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 10 09:16:47 np0005479823.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 09:16:47 np0005479823.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 09:16:47 np0005479823.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 09:16:58 np0005479823.novalocal kernel: SELinux:  Converting 363 SID table entries...
Oct 10 09:16:58 np0005479823.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 09:16:58 np0005479823.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 10 09:16:58 np0005479823.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 09:16:58 np0005479823.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 10 09:16:58 np0005479823.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 09:16:58 np0005479823.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 09:16:58 np0005479823.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 09:17:00 np0005479823.novalocal setsebool[4746]: The virt_use_nfs policy boolean was changed to 1 by root
Oct 10 09:17:00 np0005479823.novalocal setsebool[4746]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct 10 09:17:13 np0005479823.novalocal kernel: SELinux:  Converting 366 SID table entries...
Oct 10 09:17:13 np0005479823.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 09:17:13 np0005479823.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 10 09:17:13 np0005479823.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 09:17:13 np0005479823.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 10 09:17:13 np0005479823.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 09:17:13 np0005479823.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 09:17:13 np0005479823.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 09:17:34 np0005479823.novalocal dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 10 09:17:35 np0005479823.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 09:17:35 np0005479823.novalocal systemd[1]: Starting man-db-cache-update.service...
Oct 10 09:17:35 np0005479823.novalocal systemd[1]: Reloading.
Oct 10 09:17:35 np0005479823.novalocal systemd-rc-local-generator[5494]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:17:35 np0005479823.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 09:17:39 np0005479823.novalocal systemd[1]: Starting PackageKit Daemon...
Oct 10 09:17:39 np0005479823.novalocal PackageKit[7243]: daemon start
Oct 10 09:17:39 np0005479823.novalocal systemd[1]: Starting Authorization Manager...
Oct 10 09:17:39 np0005479823.novalocal polkitd[7343]: Started polkitd version 0.117
Oct 10 09:17:39 np0005479823.novalocal polkitd[7343]: Loading rules from directory /etc/polkit-1/rules.d
Oct 10 09:17:39 np0005479823.novalocal polkitd[7343]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 10 09:17:39 np0005479823.novalocal polkitd[7343]: Finished loading, compiling and executing 3 rules
Oct 10 09:17:39 np0005479823.novalocal polkitd[7343]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Oct 10 09:17:39 np0005479823.novalocal systemd[1]: Started Authorization Manager.
Oct 10 09:17:39 np0005479823.novalocal systemd[1]: Started PackageKit Daemon.
Oct 10 09:17:40 np0005479823.novalocal sudo[4675]: pam_unix(sudo:session): session closed for user root
Oct 10 09:17:41 np0005479823.novalocal python3[8205]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot" _uses_shell=True zuul_log_id=fa163efc-24cc-c8da-0a8f-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:17:42 np0005479823.novalocal kernel: evm: overlay not supported
Oct 10 09:17:42 np0005479823.novalocal systemd[1055]: Starting D-Bus User Message Bus...
Oct 10 09:17:42 np0005479823.novalocal dbus-broker-launch[9120]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct 10 09:17:42 np0005479823.novalocal dbus-broker-launch[9120]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct 10 09:17:42 np0005479823.novalocal systemd[1055]: Started D-Bus User Message Bus.
Oct 10 09:17:42 np0005479823.novalocal dbus-broker-lau[9120]: Ready
Oct 10 09:17:42 np0005479823.novalocal systemd[1055]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 10 09:17:42 np0005479823.novalocal systemd[1055]: Created slice Slice /user.
Oct 10 09:17:42 np0005479823.novalocal systemd[1055]: podman-9015.scope: unit configures an IP firewall, but not running as root.
Oct 10 09:17:42 np0005479823.novalocal systemd[1055]: (This warning is only shown for the first unit using IP firewalling.)
Oct 10 09:17:42 np0005479823.novalocal systemd[1055]: Started podman-9015.scope.
Oct 10 09:17:42 np0005479823.novalocal systemd[1055]: Started podman-pause-c7e5e74d.scope.
Oct 10 09:17:43 np0005479823.novalocal sshd-session[4651]: Connection closed by 38.102.83.114 port 44568
Oct 10 09:17:43 np0005479823.novalocal sshd-session[4648]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:17:43 np0005479823.novalocal systemd-logind[796]: Session 6 logged out. Waiting for processes to exit.
Oct 10 09:17:43 np0005479823.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Oct 10 09:17:43 np0005479823.novalocal systemd[1]: session-6.scope: Consumed 1min 5.623s CPU time.
Oct 10 09:17:43 np0005479823.novalocal systemd-logind[796]: Removed session 6.
Oct 10 09:17:57 np0005479823.novalocal sshd-session[15845]: Unable to negotiate with 38.102.83.82 port 50428: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Oct 10 09:17:57 np0005479823.novalocal sshd-session[15848]: Connection closed by 38.102.83.82 port 50420 [preauth]
Oct 10 09:17:57 np0005479823.novalocal sshd-session[15851]: Connection closed by 38.102.83.82 port 50408 [preauth]
Oct 10 09:17:57 np0005479823.novalocal sshd-session[15850]: Unable to negotiate with 38.102.83.82 port 50426: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Oct 10 09:17:57 np0005479823.novalocal sshd-session[15853]: Unable to negotiate with 38.102.83.82 port 50436: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Oct 10 09:18:01 np0005479823.novalocal sshd-session[17301]: Accepted publickey for zuul from 38.102.83.114 port 60440 ssh2: RSA SHA256:RwPGCkYG1Mlcunwa9tTlXvLSrYLunSGhwxtMMuIfos4
Oct 10 09:18:01 np0005479823.novalocal systemd-logind[796]: New session 7 of user zuul.
Oct 10 09:18:01 np0005479823.novalocal systemd[1]: Started Session 7 of User zuul.
Oct 10 09:18:01 np0005479823.novalocal sshd-session[17301]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:18:01 np0005479823.novalocal python3[17408]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLKa/9QXUogxywf992nox1ioEGXyzZloryP7qu5KhbNyvfDQXbxckfHpSRrx2tURERGS47wcXt32qRf5GMN12x0= zuul@np0005479820.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:18:01 np0005479823.novalocal sudo[17597]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rduaflpnqvkjiwdnuvimrwrcwlbpteir ; /usr/bin/python3'
Oct 10 09:18:01 np0005479823.novalocal sudo[17597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:18:01 np0005479823.novalocal python3[17609]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLKa/9QXUogxywf992nox1ioEGXyzZloryP7qu5KhbNyvfDQXbxckfHpSRrx2tURERGS47wcXt32qRf5GMN12x0= zuul@np0005479820.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:18:01 np0005479823.novalocal sudo[17597]: pam_unix(sudo:session): session closed for user root
Oct 10 09:18:02 np0005479823.novalocal sudo[17911]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfuwsfnsiqwpjvgjbnssdsriaosgaexg ; /usr/bin/python3'
Oct 10 09:18:02 np0005479823.novalocal sudo[17911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:18:02 np0005479823.novalocal python3[17923]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005479823.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct 10 09:18:02 np0005479823.novalocal useradd[18002]: new group: name=cloud-admin, GID=1002
Oct 10 09:18:02 np0005479823.novalocal useradd[18002]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Oct 10 09:18:02 np0005479823.novalocal sudo[17911]: pam_unix(sudo:session): session closed for user root
Oct 10 09:18:03 np0005479823.novalocal sudo[18168]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybxilzhunkdazwamqbhiknbatizwkccr ; /usr/bin/python3'
Oct 10 09:18:03 np0005479823.novalocal sudo[18168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:18:03 np0005479823.novalocal python3[18176]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLKa/9QXUogxywf992nox1ioEGXyzZloryP7qu5KhbNyvfDQXbxckfHpSRrx2tURERGS47wcXt32qRf5GMN12x0= zuul@np0005479820.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 09:18:03 np0005479823.novalocal sudo[18168]: pam_unix(sudo:session): session closed for user root
Oct 10 09:18:03 np0005479823.novalocal sudo[18442]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhfiywmdppmrbwphmxnbkvprkedaftsk ; /usr/bin/python3'
Oct 10 09:18:03 np0005479823.novalocal sudo[18442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:18:03 np0005479823.novalocal python3[18452]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:18:03 np0005479823.novalocal sudo[18442]: pam_unix(sudo:session): session closed for user root
Oct 10 09:18:04 np0005479823.novalocal sudo[18703]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wialibtnxnlapfeofetwsboekayhdmrs ; /usr/bin/python3'
Oct 10 09:18:04 np0005479823.novalocal sudo[18703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:18:04 np0005479823.novalocal python3[18716]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1760087883.4802418-153-276830333064873/source _original_basename=tmpqu6aj1g9 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:18:04 np0005479823.novalocal sudo[18703]: pam_unix(sudo:session): session closed for user root
Oct 10 09:18:04 np0005479823.novalocal sudo[19056]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttqrdtuxcexqzdvhoqqbdvajuydudpow ; /usr/bin/python3'
Oct 10 09:18:04 np0005479823.novalocal sudo[19056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:18:05 np0005479823.novalocal python3[19066]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Oct 10 09:18:05 np0005479823.novalocal systemd[1]: Starting Hostname Service...
Oct 10 09:18:05 np0005479823.novalocal systemd[1]: Started Hostname Service.
Oct 10 09:18:05 np0005479823.novalocal systemd-hostnamed[19168]: Changed pretty hostname to 'compute-2'
Oct 10 09:18:05 compute-2 systemd-hostnamed[19168]: Hostname set to <compute-2> (static)
Oct 10 09:18:05 compute-2 NetworkManager[3947]: <info>  [1760087885.2535] hostname: static hostname changed from "np0005479823.novalocal" to "compute-2"
Oct 10 09:18:05 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 09:18:05 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 09:18:05 compute-2 sudo[19056]: pam_unix(sudo:session): session closed for user root
Oct 10 09:18:05 compute-2 sshd-session[17348]: Connection closed by 38.102.83.114 port 60440
Oct 10 09:18:05 compute-2 sshd-session[17301]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:18:05 compute-2 systemd[1]: session-7.scope: Deactivated successfully.
Oct 10 09:18:05 compute-2 systemd[1]: session-7.scope: Consumed 2.412s CPU time.
Oct 10 09:18:05 compute-2 systemd-logind[796]: Session 7 logged out. Waiting for processes to exit.
Oct 10 09:18:05 compute-2 systemd-logind[796]: Removed session 7.
Oct 10 09:18:12 compute-2 irqbalance[790]: Cannot change IRQ 27 affinity: Operation not permitted
Oct 10 09:18:12 compute-2 irqbalance[790]: IRQ 27 affinity is now unmanaged
Oct 10 09:18:15 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 09:18:34 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 09:18:34 compute-2 systemd[1]: Finished man-db-cache-update.service.
Oct 10 09:18:34 compute-2 systemd[1]: man-db-cache-update.service: Consumed 55.078s CPU time.
Oct 10 09:18:34 compute-2 systemd[1]: run-r8859adfa7e2241e3b266999aa66918e0.service: Deactivated successfully.
Oct 10 09:18:35 compute-2 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 10 09:19:59 compute-2 systemd[1]: Starting Cleanup of Temporary Directories...
Oct 10 09:19:59 compute-2 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct 10 09:19:59 compute-2 systemd[1]: Finished Cleanup of Temporary Directories.
Oct 10 09:19:59 compute-2 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct 10 09:21:41 compute-2 sshd-session[26529]: Accepted publickey for zuul from 38.102.83.82 port 59950 ssh2: RSA SHA256:RwPGCkYG1Mlcunwa9tTlXvLSrYLunSGhwxtMMuIfos4
Oct 10 09:21:41 compute-2 systemd-logind[796]: New session 8 of user zuul.
Oct 10 09:21:41 compute-2 systemd[1]: Started Session 8 of User zuul.
Oct 10 09:21:41 compute-2 sshd-session[26529]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:21:41 compute-2 python3[26605]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:21:43 compute-2 sudo[26719]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puvmoroxwasafowsrgkdzvpokeynjych ; /usr/bin/python3'
Oct 10 09:21:43 compute-2 sudo[26719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:43 compute-2 python3[26721]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:21:43 compute-2 sudo[26719]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:43 compute-2 sudo[26792]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ispgeywnuydkgucswrjzqlhndrxrsnfy ; /usr/bin/python3'
Oct 10 09:21:43 compute-2 sudo[26792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:44 compute-2 python3[26794]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=delorean.repo follow=False checksum=c02c26d38f431b15f6463fc53c3d93ed5138ff07 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:21:44 compute-2 sudo[26792]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:44 compute-2 sudo[26818]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fccvurpyaanayikdvcybphcoodkqhset ; /usr/bin/python3'
Oct 10 09:21:44 compute-2 sudo[26818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:44 compute-2 python3[26820]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:21:44 compute-2 sudo[26818]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:44 compute-2 sudo[26891]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-papuioevaxgszqzhiwtdbkelwgcceqvs ; /usr/bin/python3'
Oct 10 09:21:44 compute-2 sudo[26891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:44 compute-2 python3[26893]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:21:44 compute-2 sudo[26891]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:44 compute-2 sudo[26917]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeobnmrvxoteeghsysogboalqkfwillk ; /usr/bin/python3'
Oct 10 09:21:44 compute-2 sudo[26917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:44 compute-2 python3[26919]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:21:44 compute-2 sudo[26917]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:45 compute-2 sudo[26990]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcfadobgpozcaovqxgmwxbmquppybaqp ; /usr/bin/python3'
Oct 10 09:21:45 compute-2 sudo[26990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:45 compute-2 python3[26992]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:21:45 compute-2 sudo[26990]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:45 compute-2 sudo[27016]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llmgupeuiptnpizvmqztmvabydtnridq ; /usr/bin/python3'
Oct 10 09:21:45 compute-2 sudo[27016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:45 compute-2 python3[27018]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:21:45 compute-2 sudo[27016]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:45 compute-2 sudo[27089]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zupoyqeapcvxziozgxazpdonwdzhdzql ; /usr/bin/python3'
Oct 10 09:21:45 compute-2 sudo[27089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:45 compute-2 python3[27091]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:21:45 compute-2 sudo[27089]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:45 compute-2 sudo[27115]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwyafokqlycnnteavmrjerwwgpxdxthg ; /usr/bin/python3'
Oct 10 09:21:45 compute-2 sudo[27115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:46 compute-2 python3[27117]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:21:46 compute-2 sudo[27115]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:46 compute-2 sudo[27188]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzxqklhidgasndhwnlvdpfswnzwapanp ; /usr/bin/python3'
Oct 10 09:21:46 compute-2 sudo[27188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:46 compute-2 python3[27190]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:21:46 compute-2 sudo[27188]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:46 compute-2 sudo[27214]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-motgzmgjcrinrlaabfrwkteankgulwku ; /usr/bin/python3'
Oct 10 09:21:46 compute-2 sudo[27214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:46 compute-2 python3[27216]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:21:46 compute-2 sudo[27214]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:46 compute-2 sudo[27287]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiqhgipogmokqgetsmjurmspjxqqvhoi ; /usr/bin/python3'
Oct 10 09:21:46 compute-2 sudo[27287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:47 compute-2 python3[27289]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:21:47 compute-2 sudo[27287]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:47 compute-2 sudo[27313]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwpdbiwswkepvxogpaykycfnnkqzqrhg ; /usr/bin/python3'
Oct 10 09:21:47 compute-2 sudo[27313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:47 compute-2 python3[27315]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:21:47 compute-2 sudo[27313]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:47 compute-2 sudo[27386]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyvgnurcljjmmtfvwfkyucjicqxmeljx ; /usr/bin/python3'
Oct 10 09:21:47 compute-2 sudo[27386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:21:47 compute-2 python3[27388]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=75ca8f9fe9a538824fd094f239c30e8ce8652e8a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:21:47 compute-2 sudo[27386]: pam_unix(sudo:session): session closed for user root
Oct 10 09:21:59 compute-2 python3[27436]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:22:44 compute-2 PackageKit[7243]: daemon quit
Oct 10 09:22:44 compute-2 systemd[1]: packagekit.service: Deactivated successfully.
Oct 10 09:26:59 compute-2 sshd-session[26532]: Received disconnect from 38.102.83.82 port 59950:11: disconnected by user
Oct 10 09:26:59 compute-2 sshd-session[26532]: Disconnected from user zuul 38.102.83.82 port 59950
Oct 10 09:26:59 compute-2 sshd-session[26529]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:26:59 compute-2 systemd[1]: session-8.scope: Deactivated successfully.
Oct 10 09:26:59 compute-2 systemd[1]: session-8.scope: Consumed 4.898s CPU time.
Oct 10 09:26:59 compute-2 systemd-logind[796]: Session 8 logged out. Waiting for processes to exit.
Oct 10 09:26:59 compute-2 systemd-logind[796]: Removed session 8.
Oct 10 09:33:02 compute-2 sshd-session[27443]: Connection closed by 223.72.5.55 port 43874
Oct 10 09:33:17 compute-2 sshd-session[27445]: Accepted publickey for zuul from 192.168.122.30 port 56168 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:33:17 compute-2 systemd-logind[796]: New session 9 of user zuul.
Oct 10 09:33:17 compute-2 systemd[1]: Started Session 9 of User zuul.
Oct 10 09:33:17 compute-2 sshd-session[27445]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:33:18 compute-2 python3.9[27598]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:33:20 compute-2 sudo[27777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwhxmncamnxapxkrlgotrdnpjarghmmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088799.6309214-59-193571166501641/AnsiballZ_command.py'
Oct 10 09:33:20 compute-2 sudo[27777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:33:20 compute-2 python3.9[27779]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:33:27 compute-2 sudo[27777]: pam_unix(sudo:session): session closed for user root
Oct 10 09:33:28 compute-2 sshd-session[27448]: Connection closed by 192.168.122.30 port 56168
Oct 10 09:33:28 compute-2 sshd-session[27445]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:33:28 compute-2 systemd[1]: session-9.scope: Deactivated successfully.
Oct 10 09:33:28 compute-2 systemd[1]: session-9.scope: Consumed 8.321s CPU time.
Oct 10 09:33:28 compute-2 systemd-logind[796]: Session 9 logged out. Waiting for processes to exit.
Oct 10 09:33:28 compute-2 systemd-logind[796]: Removed session 9.
Oct 10 09:33:43 compute-2 sshd-session[27836]: Accepted publickey for zuul from 192.168.122.30 port 39982 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:33:43 compute-2 systemd-logind[796]: New session 10 of user zuul.
Oct 10 09:33:43 compute-2 systemd[1]: Started Session 10 of User zuul.
Oct 10 09:33:43 compute-2 sshd-session[27836]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:33:44 compute-2 python3.9[27989]: ansible-ansible.legacy.ping Invoked with data=pong
Oct 10 09:33:45 compute-2 python3.9[28163]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:33:46 compute-2 sudo[28313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjkljzagfgpijuzjwfasaqgllkndlsmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088826.0305622-95-253830468558238/AnsiballZ_command.py'
Oct 10 09:33:46 compute-2 sudo[28313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:33:46 compute-2 python3.9[28315]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:33:46 compute-2 sudo[28313]: pam_unix(sudo:session): session closed for user root
Oct 10 09:33:47 compute-2 sudo[28466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugrcehecumufvhxsuetskvclqapeulfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088827.1933606-131-205909046294109/AnsiballZ_stat.py'
Oct 10 09:33:47 compute-2 sudo[28466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:33:47 compute-2 python3.9[28468]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:33:47 compute-2 sudo[28466]: pam_unix(sudo:session): session closed for user root
Oct 10 09:33:48 compute-2 sudo[28618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmzosjbxdfsnknrhxdmswcrgikojeojg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088828.14815-155-13036541709313/AnsiballZ_file.py'
Oct 10 09:33:48 compute-2 sudo[28618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:33:48 compute-2 python3.9[28620]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:33:48 compute-2 sudo[28618]: pam_unix(sudo:session): session closed for user root
Oct 10 09:33:49 compute-2 sudo[28770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lidnzursrvgthzkjcajfqxzgdffdgjoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088829.1501343-179-244385191920953/AnsiballZ_stat.py'
Oct 10 09:33:49 compute-2 sudo[28770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:33:49 compute-2 python3.9[28772]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:33:49 compute-2 sudo[28770]: pam_unix(sudo:session): session closed for user root
Oct 10 09:33:50 compute-2 sudo[28893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikdqqeothvltbtniuvbqlahorpucafbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088829.1501343-179-244385191920953/AnsiballZ_copy.py'
Oct 10 09:33:50 compute-2 sudo[28893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:33:50 compute-2 python3.9[28895]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1760088829.1501343-179-244385191920953/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:33:50 compute-2 sudo[28893]: pam_unix(sudo:session): session closed for user root
Oct 10 09:33:51 compute-2 sudo[29045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlzkhwpvlpeybwaenqyblkijhojasyeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088830.8879988-225-183931276798903/AnsiballZ_setup.py'
Oct 10 09:33:51 compute-2 sudo[29045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:33:51 compute-2 python3.9[29047]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:33:51 compute-2 sudo[29045]: pam_unix(sudo:session): session closed for user root
Oct 10 09:33:52 compute-2 sudo[29201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lruxnqoujlpmothqmmcboigwcgcwnpgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088831.9952579-248-271604264566031/AnsiballZ_file.py'
Oct 10 09:33:52 compute-2 sudo[29201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:33:52 compute-2 irqbalance[790]: Cannot change IRQ 26 affinity: Operation not permitted
Oct 10 09:33:52 compute-2 irqbalance[790]: IRQ 26 affinity is now unmanaged
Oct 10 09:33:52 compute-2 python3.9[29203]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:33:52 compute-2 sudo[29201]: pam_unix(sudo:session): session closed for user root
Oct 10 09:33:53 compute-2 python3.9[29353]: ansible-ansible.builtin.service_facts Invoked
Oct 10 09:34:00 compute-2 python3.9[29608]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:34:01 compute-2 python3.9[29758]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:34:02 compute-2 python3.9[29912]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:34:03 compute-2 sudo[30068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlztvkijiwhgysktjberbczunmrxysiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088842.9289649-392-140169213736852/AnsiballZ_setup.py'
Oct 10 09:34:03 compute-2 sudo[30068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:34:03 compute-2 python3.9[30070]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:34:03 compute-2 sudo[30068]: pam_unix(sudo:session): session closed for user root
Oct 10 09:34:04 compute-2 sudo[30152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfagavgiovmnxmraavzyepjygkenctoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088842.9289649-392-140169213736852/AnsiballZ_dnf.py'
Oct 10 09:34:04 compute-2 sudo[30152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:34:04 compute-2 python3.9[30154]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:34:41 compute-2 sshd-session[27444]: Connection closed by authenticating user root 223.72.5.55 port 44684 [preauth]
Oct 10 09:34:48 compute-2 systemd[1]: Reloading.
Oct 10 09:34:48 compute-2 systemd-rc-local-generator[30351]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:34:48 compute-2 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct 10 09:34:49 compute-2 systemd[1]: Reloading.
Oct 10 09:34:49 compute-2 systemd-rc-local-generator[30390]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:34:49 compute-2 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct 10 09:34:49 compute-2 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct 10 09:34:49 compute-2 systemd[1]: Reloading.
Oct 10 09:34:49 compute-2 systemd-rc-local-generator[30432]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:34:49 compute-2 systemd[1]: Listening on LVM2 poll daemon socket.
Oct 10 09:34:49 compute-2 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 09:34:49 compute-2 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 09:34:49 compute-2 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 09:35:55 compute-2 kernel: SELinux:  Converting 2713 SID table entries...
Oct 10 09:35:55 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 09:35:55 compute-2 kernel: SELinux:  policy capability open_perms=1
Oct 10 09:35:55 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 09:35:55 compute-2 kernel: SELinux:  policy capability always_check_network=0
Oct 10 09:35:55 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 09:35:55 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 09:35:55 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 09:35:55 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct 10 09:35:55 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 09:35:56 compute-2 systemd[1]: Starting man-db-cache-update.service...
Oct 10 09:35:56 compute-2 systemd[1]: Reloading.
Oct 10 09:35:56 compute-2 systemd-rc-local-generator[30744]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:35:56 compute-2 systemd[1]: Starting dnf makecache...
Oct 10 09:35:56 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 09:35:56 compute-2 dnf[30762]: Failed determining last makecache time.
Oct 10 09:35:56 compute-2 systemd[1]: Starting PackageKit Daemon...
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-openstack-barbican-42b4c41831408a8e323 103 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 PackageKit[30964]: daemon start
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 173 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-openstack-cinder-1c00d6490d88e436f26ef 165 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 systemd[1]: Started PackageKit Daemon.
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-python-stevedore-c4acc5639fd2329372142 177 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-python-cloudkitty-tests-tempest-3961dc 174 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-diskimage-builder-43381184423c185801b5 103 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 157 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-python-designate-tests-tempest-347fdbc 163 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-openstack-glance-1fd12c29b339f30fe823e 172 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 174 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-openstack-manila-3c01b7181572c95dac462 182 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-python-vmware-nsxlib-458234972d1428ac9 170 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-openstack-octavia-ba397f07a7331190208c 184 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 sudo[30152]: pam_unix(sudo:session): session closed for user root
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-openstack-watcher-c014f81a8647287f6dcc 176 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-edpm-image-builder-55ba53cf215b14ed95b 139 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 158 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-openstack-swift-dc98a8463506ac520c469a 127 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-python-tempestconf-8515371b7cceebd4282 117 kB/s | 3.0 kB     00:00
Oct 10 09:35:56 compute-2 dnf[30762]: delorean-openstack-heat-ui-013accbfd179753bc3f0 128 kB/s | 3.0 kB     00:00
Oct 10 09:35:57 compute-2 dnf[30762]: CentOS Stream 9 - BaseOS                         24 kB/s | 6.7 kB     00:00
Oct 10 09:35:57 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 09:35:57 compute-2 systemd[1]: Finished man-db-cache-update.service.
Oct 10 09:35:57 compute-2 systemd[1]: man-db-cache-update.service: Consumed 1.418s CPU time.
Oct 10 09:35:57 compute-2 systemd[1]: run-r7e7070006be94f99af409294bec20270.service: Deactivated successfully.
Oct 10 09:35:57 compute-2 dnf[30762]: CentOS Stream 9 - AppStream                      44 kB/s | 6.8 kB     00:00
Oct 10 09:35:57 compute-2 dnf[30762]: CentOS Stream 9 - CRB                            64 kB/s | 6.6 kB     00:00
Oct 10 09:35:57 compute-2 dnf[30762]: CentOS Stream 9 - Extras packages                74 kB/s | 8.0 kB     00:00
Oct 10 09:35:57 compute-2 dnf[30762]: dlrn-antelope-testing                           168 kB/s | 3.0 kB     00:00
Oct 10 09:35:57 compute-2 dnf[30762]: dlrn-antelope-build-deps                        181 kB/s | 3.0 kB     00:00
Oct 10 09:35:57 compute-2 dnf[30762]: centos9-rabbitmq                                 51 kB/s | 3.0 kB     00:00
Oct 10 09:35:57 compute-2 dnf[30762]: centos9-storage                                  77 kB/s | 3.0 kB     00:00
Oct 10 09:35:58 compute-2 dnf[30762]: centos9-opstools                                 78 kB/s | 3.0 kB     00:00
Oct 10 09:35:58 compute-2 dnf[30762]: NFV SIG OpenvSwitch                             106 kB/s | 3.0 kB     00:00
Oct 10 09:35:58 compute-2 dnf[30762]: repo-setup-centos-appstream                     210 kB/s | 4.4 kB     00:00
Oct 10 09:35:58 compute-2 dnf[30762]: repo-setup-centos-baseos                        149 kB/s | 3.9 kB     00:00
Oct 10 09:35:58 compute-2 dnf[30762]: repo-setup-centos-highavailability               38 kB/s | 3.9 kB     00:00
Oct 10 09:35:58 compute-2 dnf[30762]: repo-setup-centos-powertools                    157 kB/s | 4.3 kB     00:00
Oct 10 09:35:58 compute-2 dnf[30762]: Extra Packages for Enterprise Linux 9 - x86_64  106 kB/s |  25 kB     00:00
Oct 10 09:35:59 compute-2 dnf[30762]: Metadata cache created.
Oct 10 09:35:59 compute-2 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 10 09:35:59 compute-2 systemd[1]: Finished dnf makecache.
Oct 10 09:35:59 compute-2 systemd[1]: dnf-makecache.service: Consumed 1.945s CPU time.
Oct 10 09:36:01 compute-2 sudo[31703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqrhwctcuwpauxkogakgvvmchikciyij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088960.8068092-429-36663543009045/AnsiballZ_command.py'
Oct 10 09:36:01 compute-2 sudo[31703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:01 compute-2 python3.9[31705]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:36:02 compute-2 sudo[31703]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:03 compute-2 sudo[31984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axcgimadlwqzzprlpjdyfivndpffgnsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088962.7607846-453-66502398164454/AnsiballZ_selinux.py'
Oct 10 09:36:03 compute-2 sudo[31984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:03 compute-2 python3.9[31986]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct 10 09:36:03 compute-2 sudo[31984]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:04 compute-2 sudo[32136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stdlhdqidfynyncstlwvaxsklfpmbxtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088964.3471878-486-119243099013589/AnsiballZ_command.py'
Oct 10 09:36:04 compute-2 sudo[32136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:04 compute-2 python3.9[32138]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct 10 09:36:05 compute-2 sudo[32136]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:06 compute-2 sudo[32289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvuzjpbrrzpbdkatdllvmnrnenczguge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088966.0753527-509-266431319028816/AnsiballZ_file.py'
Oct 10 09:36:06 compute-2 sudo[32289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:08 compute-2 python3.9[32291]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:36:08 compute-2 sudo[32289]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:08 compute-2 sudo[32442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgetqycpstivczzivroateswpedhslut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088968.5182576-534-232622155453273/AnsiballZ_mount.py'
Oct 10 09:36:08 compute-2 sudo[32442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:09 compute-2 python3.9[32444]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct 10 09:36:09 compute-2 sudo[32442]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:10 compute-2 sudo[32594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fayigddjmxbqabkpokikjbjiiauilnhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088970.1809375-618-56730444552814/AnsiballZ_file.py'
Oct 10 09:36:10 compute-2 sudo[32594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:10 compute-2 python3.9[32596]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:36:10 compute-2 sudo[32594]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:11 compute-2 sudo[32746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owsclljlrkmeyepxgxviombkvoqngosb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088970.9191601-642-223959905449872/AnsiballZ_stat.py'
Oct 10 09:36:11 compute-2 sudo[32746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:16 compute-2 python3.9[32748]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:36:16 compute-2 sudo[32746]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:17 compute-2 sudo[32869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxsxekrlmcvevjgddpidsgkbtbjftwzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088970.9191601-642-223959905449872/AnsiballZ_copy.py'
Oct 10 09:36:17 compute-2 sudo[32869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:17 compute-2 python3.9[32871]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760088970.9191601-642-223959905449872/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:36:17 compute-2 sudo[32869]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:19 compute-2 sudo[33021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdnehefprnzmogiebhahwzhlouwwennx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088978.5042758-723-276760248385126/AnsiballZ_getent.py'
Oct 10 09:36:19 compute-2 sudo[33021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:19 compute-2 python3.9[33023]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct 10 09:36:19 compute-2 sudo[33021]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:20 compute-2 sudo[33174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezbqgzxwpyfczrxitautofbtfckhzjsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088979.624986-747-14735116505941/AnsiballZ_group.py'
Oct 10 09:36:20 compute-2 sudo[33174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:20 compute-2 python3.9[33176]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 09:36:20 compute-2 groupadd[33177]: group added to /etc/group: name=qemu, GID=107
Oct 10 09:36:20 compute-2 groupadd[33177]: group added to /etc/gshadow: name=qemu
Oct 10 09:36:20 compute-2 groupadd[33177]: new group: name=qemu, GID=107
Oct 10 09:36:20 compute-2 sudo[33174]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:21 compute-2 sudo[33332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fousrtqwnjvtswiljqzmcvtnssnicohy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088980.729795-770-162336137963227/AnsiballZ_user.py'
Oct 10 09:36:21 compute-2 sudo[33332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:21 compute-2 python3.9[33334]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 10 09:36:21 compute-2 useradd[33336]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Oct 10 09:36:21 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 09:36:21 compute-2 sudo[33332]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:22 compute-2 sudo[33493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgdtosmswymttyhibqztcfhapaloqepx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088981.8170767-796-162568805533715/AnsiballZ_getent.py'
Oct 10 09:36:22 compute-2 sudo[33493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:22 compute-2 python3.9[33495]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct 10 09:36:22 compute-2 sudo[33493]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:23 compute-2 sudo[33646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtuudzopphvbxxdiqfseofqovaikgvgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088982.82832-819-173833599346497/AnsiballZ_group.py'
Oct 10 09:36:23 compute-2 sudo[33646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:23 compute-2 python3.9[33648]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 09:36:23 compute-2 groupadd[33649]: group added to /etc/group: name=hugetlbfs, GID=42477
Oct 10 09:36:23 compute-2 groupadd[33649]: group added to /etc/gshadow: name=hugetlbfs
Oct 10 09:36:23 compute-2 groupadd[33649]: new group: name=hugetlbfs, GID=42477
Oct 10 09:36:23 compute-2 sudo[33646]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:24 compute-2 sudo[33804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlvnohbexmcdanszbtgqtytxxgfkleqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088983.8428066-846-254825724009708/AnsiballZ_file.py'
Oct 10 09:36:24 compute-2 sudo[33804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:24 compute-2 python3.9[33806]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct 10 09:36:24 compute-2 sudo[33804]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:25 compute-2 sudo[33956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtajrfvksblvxyritctbqlbntikzbumm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088984.8806756-879-187288863530320/AnsiballZ_dnf.py'
Oct 10 09:36:25 compute-2 sudo[33956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:25 compute-2 python3.9[33958]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:36:27 compute-2 sudo[33956]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:27 compute-2 sudo[34109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdxbomnwekwgyigefhoxawljberdnmxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088987.6226625-903-25842285286503/AnsiballZ_file.py'
Oct 10 09:36:27 compute-2 sudo[34109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:28 compute-2 python3.9[34111]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:36:28 compute-2 sudo[34109]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:28 compute-2 sudo[34261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omfzqbwfwjwprmrksdkhugtcucnjzycy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088988.3519216-927-115084464521278/AnsiballZ_stat.py'
Oct 10 09:36:28 compute-2 sudo[34261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:28 compute-2 python3.9[34263]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:36:28 compute-2 sudo[34261]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:29 compute-2 sudo[34384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqpcnwrcugseqhwfrdbvaeknzwtctxit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088988.3519216-927-115084464521278/AnsiballZ_copy.py'
Oct 10 09:36:29 compute-2 sudo[34384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:29 compute-2 python3.9[34386]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760088988.3519216-927-115084464521278/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:36:29 compute-2 sudo[34384]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:30 compute-2 sudo[34536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prvjswuchlbiakmnmcgfluwwoepfjgrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088989.7609081-972-48950695220032/AnsiballZ_systemd.py'
Oct 10 09:36:30 compute-2 sudo[34536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:30 compute-2 python3.9[34538]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 09:36:30 compute-2 systemd[1]: Starting Load Kernel Modules...
Oct 10 09:36:30 compute-2 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct 10 09:36:30 compute-2 kernel: Bridge firewalling registered
Oct 10 09:36:30 compute-2 systemd-modules-load[34542]: Inserted module 'br_netfilter'
Oct 10 09:36:30 compute-2 systemd[1]: Finished Load Kernel Modules.
Oct 10 09:36:30 compute-2 sudo[34536]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:31 compute-2 sudo[34695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xevpsivfaewhwddpgbykxbluldpmfyte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088991.0718472-996-21999562878414/AnsiballZ_stat.py'
Oct 10 09:36:31 compute-2 sudo[34695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:31 compute-2 python3.9[34697]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:36:31 compute-2 sudo[34695]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:31 compute-2 sudo[34818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmjkmmucfmdjybcdhxgfqssfbuasfgok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088991.0718472-996-21999562878414/AnsiballZ_copy.py'
Oct 10 09:36:31 compute-2 sudo[34818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:32 compute-2 python3.9[34820]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760088991.0718472-996-21999562878414/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:36:32 compute-2 sudo[34818]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:32 compute-2 sudo[34970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odbkcpqizxfidgamzfcjgkintqgydufb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760088992.6750503-1049-26216814082881/AnsiballZ_dnf.py'
Oct 10 09:36:32 compute-2 sudo[34970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:33 compute-2 python3.9[34972]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:36:36 compute-2 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 09:36:36 compute-2 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 09:36:36 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 09:36:36 compute-2 systemd[1]: Starting man-db-cache-update.service...
Oct 10 09:36:36 compute-2 systemd[1]: Reloading.
Oct 10 09:36:37 compute-2 systemd-rc-local-generator[35031]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:36:37 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 09:36:37 compute-2 sudo[34970]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:38 compute-2 python3.9[36648]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:36:39 compute-2 python3.9[37588]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct 10 09:36:40 compute-2 python3.9[38424]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:36:41 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 09:36:41 compute-2 systemd[1]: Finished man-db-cache-update.service.
Oct 10 09:36:41 compute-2 systemd[1]: man-db-cache-update.service: Consumed 5.230s CPU time.
Oct 10 09:36:41 compute-2 systemd[1]: run-r05c4a1e4465e47d28f03753e58d0450b.service: Deactivated successfully.
Oct 10 09:36:41 compute-2 sudo[39143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgxqzfwncmeijdzvqaqceybrnxbumfok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089000.8983748-1167-252295090557759/AnsiballZ_command.py'
Oct 10 09:36:41 compute-2 sudo[39143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:41 compute-2 python3.9[39145]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:36:41 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 10 09:36:41 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 10 09:36:42 compute-2 sudo[39143]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:42 compute-2 sudo[39516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnzzvxfmjonoapmbsjlvsfxkimcktcsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089002.5136037-1194-74440847251765/AnsiballZ_systemd.py'
Oct 10 09:36:42 compute-2 sudo[39516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:43 compute-2 python3.9[39518]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:36:43 compute-2 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct 10 09:36:43 compute-2 systemd[1]: tuned.service: Deactivated successfully.
Oct 10 09:36:43 compute-2 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct 10 09:36:43 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 10 09:36:43 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 10 09:36:43 compute-2 sudo[39516]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:44 compute-2 python3.9[39680]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct 10 09:36:47 compute-2 sudo[39830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzsjmcuijpddxprcakyqocnnifrmnavx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089007.5428064-1365-95480749559087/AnsiballZ_systemd.py'
Oct 10 09:36:47 compute-2 sudo[39830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:48 compute-2 python3.9[39832]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:36:48 compute-2 systemd[1]: Reloading.
Oct 10 09:36:48 compute-2 systemd-rc-local-generator[39864]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:36:48 compute-2 sudo[39830]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:48 compute-2 sudo[40020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqofcunxvhosvngtkpmisopyzerdfyta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089008.6798308-1365-237453720359060/AnsiballZ_systemd.py'
Oct 10 09:36:48 compute-2 sudo[40020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:49 compute-2 python3.9[40022]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:36:49 compute-2 systemd[1]: Reloading.
Oct 10 09:36:49 compute-2 systemd-rc-local-generator[40052]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:36:49 compute-2 sudo[40020]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:50 compute-2 sudo[40209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsfmqudsvxaradowcwsnwipydayrecwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089010.0093026-1413-237217064848752/AnsiballZ_command.py'
Oct 10 09:36:50 compute-2 sudo[40209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:50 compute-2 python3.9[40211]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:36:50 compute-2 sudo[40209]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:51 compute-2 sudo[40362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjppsssjyydkybucgzdiwzhwtnnthzmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089010.8921704-1437-66164299246014/AnsiballZ_command.py'
Oct 10 09:36:51 compute-2 sudo[40362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:51 compute-2 python3.9[40364]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:36:51 compute-2 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct 10 09:36:51 compute-2 sudo[40362]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:52 compute-2 sudo[40515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-todhrsznfhstmnycnkfwpisiomjosjiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089011.9654381-1461-11990475011346/AnsiballZ_command.py'
Oct 10 09:36:52 compute-2 sudo[40515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:52 compute-2 python3.9[40517]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:36:53 compute-2 sudo[40515]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:54 compute-2 sudo[40677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjryumwtksoczwpeuyktulsjodtixbas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089014.3297007-1484-39067326831737/AnsiballZ_command.py'
Oct 10 09:36:54 compute-2 sudo[40677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:54 compute-2 python3.9[40679]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:36:54 compute-2 sudo[40677]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:55 compute-2 sudo[40830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olsajnzaomizdjeggoggqpasfxpjwqai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089015.1161487-1509-231259241299775/AnsiballZ_systemd.py'
Oct 10 09:36:55 compute-2 sudo[40830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:36:55 compute-2 python3.9[40832]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 09:36:55 compute-2 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 10 09:36:55 compute-2 systemd[1]: Stopped Apply Kernel Variables.
Oct 10 09:36:55 compute-2 systemd[1]: Stopping Apply Kernel Variables...
Oct 10 09:36:55 compute-2 systemd[1]: Starting Apply Kernel Variables...
Oct 10 09:36:55 compute-2 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 10 09:36:55 compute-2 systemd[1]: Finished Apply Kernel Variables.
Oct 10 09:36:55 compute-2 sudo[40830]: pam_unix(sudo:session): session closed for user root
Oct 10 09:36:57 compute-2 sshd-session[27839]: Connection closed by 192.168.122.30 port 39982
Oct 10 09:36:57 compute-2 sshd-session[27836]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:36:57 compute-2 systemd[1]: session-10.scope: Deactivated successfully.
Oct 10 09:36:57 compute-2 systemd[1]: session-10.scope: Consumed 2min 18.721s CPU time.
Oct 10 09:36:57 compute-2 systemd-logind[796]: Session 10 logged out. Waiting for processes to exit.
Oct 10 09:36:57 compute-2 systemd-logind[796]: Removed session 10.
Oct 10 09:37:02 compute-2 sshd-session[40863]: Accepted publickey for zuul from 192.168.122.30 port 40112 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:37:02 compute-2 systemd-logind[796]: New session 11 of user zuul.
Oct 10 09:37:02 compute-2 systemd[1]: Started Session 11 of User zuul.
Oct 10 09:37:02 compute-2 sshd-session[40863]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:37:03 compute-2 python3.9[41016]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:37:05 compute-2 sudo[41170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izrpjvxmytliejhdelcvldaqxlsrqaut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089024.5651302-70-16700258396762/AnsiballZ_getent.py'
Oct 10 09:37:05 compute-2 sudo[41170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:05 compute-2 python3.9[41172]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct 10 09:37:05 compute-2 sudo[41170]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:05 compute-2 sudo[41323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkdsmqoziqqopxmcqlqqjuvdyslprrhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089025.4870722-94-119683005015829/AnsiballZ_group.py'
Oct 10 09:37:05 compute-2 sudo[41323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:06 compute-2 python3.9[41325]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 09:37:06 compute-2 groupadd[41326]: group added to /etc/group: name=openvswitch, GID=42476
Oct 10 09:37:06 compute-2 groupadd[41326]: group added to /etc/gshadow: name=openvswitch
Oct 10 09:37:06 compute-2 groupadd[41326]: new group: name=openvswitch, GID=42476
Oct 10 09:37:06 compute-2 sudo[41323]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:06 compute-2 sudo[41481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lemumqbsuiweqxnvpsxhjxoypkxoufys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089026.502405-118-76477863929633/AnsiballZ_user.py'
Oct 10 09:37:06 compute-2 sudo[41481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:07 compute-2 python3.9[41483]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 10 09:37:07 compute-2 useradd[41485]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Oct 10 09:37:07 compute-2 useradd[41485]: add 'openvswitch' to group 'hugetlbfs'
Oct 10 09:37:07 compute-2 useradd[41485]: add 'openvswitch' to shadow group 'hugetlbfs'
Oct 10 09:37:07 compute-2 sudo[41481]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:07 compute-2 sudo[41641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jydayqufglleuxysnvoeuxhzgobfrjqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089027.7041728-148-181949038941059/AnsiballZ_setup.py'
Oct 10 09:37:07 compute-2 sudo[41641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:08 compute-2 python3.9[41643]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:37:08 compute-2 sudo[41641]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:08 compute-2 sudo[41725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hglbhuotedadwbjfrdcoleaorpyftcna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089027.7041728-148-181949038941059/AnsiballZ_dnf.py'
Oct 10 09:37:08 compute-2 sudo[41725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:09 compute-2 python3.9[41727]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 10 09:37:11 compute-2 sudo[41725]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:12 compute-2 sudo[41889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymlionjtixmahvaaslugukjdjwvdnmcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089031.7311609-190-72945169097281/AnsiballZ_dnf.py'
Oct 10 09:37:12 compute-2 sudo[41889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:12 compute-2 python3.9[41891]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:37:25 compute-2 kernel: SELinux:  Converting 2724 SID table entries...
Oct 10 09:37:25 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 09:37:25 compute-2 kernel: SELinux:  policy capability open_perms=1
Oct 10 09:37:25 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 09:37:25 compute-2 kernel: SELinux:  policy capability always_check_network=0
Oct 10 09:37:25 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 09:37:25 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 09:37:25 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 09:37:25 compute-2 groupadd[41914]: group added to /etc/group: name=unbound, GID=993
Oct 10 09:37:25 compute-2 groupadd[41914]: group added to /etc/gshadow: name=unbound
Oct 10 09:37:25 compute-2 groupadd[41914]: new group: name=unbound, GID=993
Oct 10 09:37:25 compute-2 useradd[41921]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Oct 10 09:37:25 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct 10 09:37:25 compute-2 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct 10 09:37:27 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 09:37:27 compute-2 systemd[1]: Starting man-db-cache-update.service...
Oct 10 09:37:27 compute-2 systemd[1]: Reloading.
Oct 10 09:37:27 compute-2 systemd-rc-local-generator[42417]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:37:27 compute-2 systemd-sysv-generator[42421]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:37:27 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 09:37:27 compute-2 sudo[41889]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:28 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 09:37:28 compute-2 systemd[1]: Finished man-db-cache-update.service.
Oct 10 09:37:28 compute-2 systemd[1]: run-r43b98f29bd7b4209b060fbf0717fdefb.service: Deactivated successfully.
Oct 10 09:37:28 compute-2 sudo[42990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtjlqpkxzivcpgmsvnkgwenplyjykuoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089048.1312075-214-245062683200864/AnsiballZ_systemd.py'
Oct 10 09:37:28 compute-2 sudo[42990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:29 compute-2 python3.9[42992]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 09:37:29 compute-2 systemd[1]: Reloading.
Oct 10 09:37:29 compute-2 systemd-sysv-generator[43026]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:37:29 compute-2 systemd-rc-local-generator[43023]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:37:29 compute-2 systemd[1]: Starting Open vSwitch Database Unit...
Oct 10 09:37:29 compute-2 chown[43033]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct 10 09:37:29 compute-2 ovs-ctl[43038]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct 10 09:37:29 compute-2 ovs-ctl[43038]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct 10 09:37:29 compute-2 ovs-ctl[43038]: Starting ovsdb-server [  OK  ]
Oct 10 09:37:29 compute-2 ovs-vsctl[43087]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct 10 09:37:29 compute-2 ovs-vsctl[43107]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"49146ebb-575d-4bd4-816c-0b242fb944ee\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct 10 09:37:29 compute-2 ovs-ctl[43038]: Configuring Open vSwitch system IDs [  OK  ]
Oct 10 09:37:29 compute-2 ovs-vsctl[43113]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Oct 10 09:37:29 compute-2 ovs-ctl[43038]: Enabling remote OVSDB managers [  OK  ]
Oct 10 09:37:29 compute-2 systemd[1]: Started Open vSwitch Database Unit.
Oct 10 09:37:29 compute-2 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct 10 09:37:29 compute-2 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct 10 09:37:29 compute-2 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct 10 09:37:29 compute-2 kernel: openvswitch: Open vSwitch switching datapath
Oct 10 09:37:29 compute-2 ovs-ctl[43159]: Inserting openvswitch module [  OK  ]
Oct 10 09:37:30 compute-2 ovs-ctl[43127]: Starting ovs-vswitchd [  OK  ]
Oct 10 09:37:30 compute-2 ovs-vsctl[43178]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Oct 10 09:37:30 compute-2 ovs-ctl[43127]: Enabling remote OVSDB managers [  OK  ]
Oct 10 09:37:30 compute-2 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct 10 09:37:30 compute-2 systemd[1]: Starting Open vSwitch...
Oct 10 09:37:30 compute-2 systemd[1]: Finished Open vSwitch.
Oct 10 09:37:30 compute-2 sudo[42990]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:31 compute-2 python3.9[43329]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:37:32 compute-2 sudo[43479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khaeqbfxahsjveewhgsgyivmulkhcrdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089051.607626-268-113351007804740/AnsiballZ_sefcontext.py'
Oct 10 09:37:32 compute-2 sudo[43479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:32 compute-2 python3.9[43481]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct 10 09:37:33 compute-2 kernel: SELinux:  Converting 2738 SID table entries...
Oct 10 09:37:33 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 09:37:33 compute-2 kernel: SELinux:  policy capability open_perms=1
Oct 10 09:37:33 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 09:37:33 compute-2 kernel: SELinux:  policy capability always_check_network=0
Oct 10 09:37:33 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 09:37:33 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 09:37:33 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 09:37:34 compute-2 sudo[43479]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:35 compute-2 python3.9[43636]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:37:35 compute-2 sudo[43792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yscdzhgqgskxhvellwlatcoemedswhgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089055.4484942-322-120897047397832/AnsiballZ_dnf.py'
Oct 10 09:37:35 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct 10 09:37:35 compute-2 sudo[43792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:36 compute-2 python3.9[43794]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:37:37 compute-2 sudo[43792]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:38 compute-2 sudo[43945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwxeoxuwrgfghuvxbvlphswcyxxnexqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089057.485836-346-16270426427175/AnsiballZ_command.py'
Oct 10 09:37:38 compute-2 sudo[43945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:38 compute-2 python3.9[43947]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:37:38 compute-2 sudo[43945]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:39 compute-2 sudo[44232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kspdedepzrjmtzjwurjujfxikkzxxsmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089059.1743417-371-230586045154270/AnsiballZ_file.py'
Oct 10 09:37:39 compute-2 sudo[44232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:39 compute-2 python3.9[44234]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 10 09:37:39 compute-2 sudo[44232]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:40 compute-2 python3.9[44384]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:37:41 compute-2 sudo[44536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtrsnvhbkcxxppzkeojvqqycqkhyjsot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089061.0603077-418-135527131536486/AnsiballZ_dnf.py'
Oct 10 09:37:41 compute-2 sudo[44536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:41 compute-2 python3.9[44538]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:37:43 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 09:37:43 compute-2 systemd[1]: Starting man-db-cache-update.service...
Oct 10 09:37:43 compute-2 systemd[1]: Reloading.
Oct 10 09:37:43 compute-2 systemd-rc-local-generator[44576]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:37:43 compute-2 systemd-sysv-generator[44579]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:37:43 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 09:37:43 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 09:37:43 compute-2 systemd[1]: Finished man-db-cache-update.service.
Oct 10 09:37:43 compute-2 systemd[1]: run-r53a3a03dafc243e28330f71e5c2c08d8.service: Deactivated successfully.
Oct 10 09:37:44 compute-2 sudo[44536]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:44 compute-2 sudo[44853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlysxpvcgmpoyctguvcyftsnjdfqirku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089064.201948-442-189553541294812/AnsiballZ_systemd.py'
Oct 10 09:37:44 compute-2 sudo[44853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:44 compute-2 python3.9[44855]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 09:37:44 compute-2 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 10 09:37:44 compute-2 systemd[1]: Stopped Network Manager Wait Online.
Oct 10 09:37:44 compute-2 systemd[1]: Stopping Network Manager Wait Online...
Oct 10 09:37:44 compute-2 systemd[1]: Stopping Network Manager...
Oct 10 09:37:44 compute-2 NetworkManager[3947]: <info>  [1760089064.8229] caught SIGTERM, shutting down normally.
Oct 10 09:37:44 compute-2 NetworkManager[3947]: <info>  [1760089064.8241] dhcp4 (eth0): canceled DHCP transaction
Oct 10 09:37:44 compute-2 NetworkManager[3947]: <info>  [1760089064.8242] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 09:37:44 compute-2 NetworkManager[3947]: <info>  [1760089064.8242] dhcp4 (eth0): state changed no lease
Oct 10 09:37:44 compute-2 NetworkManager[3947]: <info>  [1760089064.8244] manager: NetworkManager state is now CONNECTED_SITE
Oct 10 09:37:44 compute-2 NetworkManager[3947]: <info>  [1760089064.8317] exiting (success)
Oct 10 09:37:44 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 09:37:44 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 09:37:44 compute-2 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 10 09:37:44 compute-2 systemd[1]: Stopped Network Manager.
Oct 10 09:37:44 compute-2 systemd[1]: NetworkManager.service: Consumed 9.978s CPU time, 4.1M memory peak, read 0B from disk, written 15.5K to disk.
Oct 10 09:37:44 compute-2 systemd[1]: Starting Network Manager...
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.8975] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:d2fa8de7-cb1e-4362-bed6-d8a2357f049b)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.8977] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9031] manager[0x56502bf26090]: monitoring kernel firmware directory '/lib/firmware'.
Oct 10 09:37:44 compute-2 systemd[1]: Starting Hostname Service...
Oct 10 09:37:44 compute-2 systemd[1]: Started Hostname Service.
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9802] hostname: hostname: using hostnamed
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9802] hostname: static hostname changed from (none) to "compute-2"
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9808] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9813] manager[0x56502bf26090]: rfkill: Wi-Fi hardware radio set enabled
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9813] manager[0x56502bf26090]: rfkill: WWAN hardware radio set enabled
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9841] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9851] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9852] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9853] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9854] manager: Networking is enabled by state file
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9856] settings: Loaded settings plugin: keyfile (internal)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9861] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9896] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9908] dhcp: init: Using DHCP client 'internal'
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9912] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9920] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9926] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9935] device (lo): Activation: starting connection 'lo' (b2f4c0ce-6660-4aa4-ac06-17229f19cc05)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9944] device (eth0): carrier: link connected
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9949] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9954] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9955] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9962] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9971] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9980] device (eth1): carrier: link connected
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9986] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9994] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (97070329-66da-5289-8aaa-712e43fb35a8) (indicated)
Oct 10 09:37:44 compute-2 NetworkManager[44866]: <info>  [1760089064.9995] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0004] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0015] device (eth1): Activation: starting connection 'ci-private-network' (97070329-66da-5289-8aaa-712e43fb35a8)
Oct 10 09:37:45 compute-2 systemd[1]: Started Network Manager.
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0027] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0321] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0324] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0327] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0330] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0332] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0334] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0335] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0341] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0347] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0350] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0373] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0383] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0391] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0392] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0397] device (lo): Activation: successful, device activated.
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0403] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0409] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 10 09:37:45 compute-2 systemd[1]: Starting Network Manager Wait Online...
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0480] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0487] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0494] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0498] manager: NetworkManager state is now CONNECTED_LOCAL
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0502] device (eth1): Activation: successful, device activated.
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0512] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0513] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0517] manager: NetworkManager state is now CONNECTED_SITE
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0522] device (eth0): Activation: successful, device activated.
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0526] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 10 09:37:45 compute-2 NetworkManager[44866]: <info>  [1760089065.0557] manager: startup complete
Oct 10 09:37:45 compute-2 sudo[44853]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:45 compute-2 systemd[1]: Finished Network Manager Wait Online.
Oct 10 09:37:45 compute-2 sudo[45080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfutiptwdcumewpjirmayfhtwrlnhgwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089065.324918-466-61479126688034/AnsiballZ_dnf.py'
Oct 10 09:37:45 compute-2 sudo[45080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:45 compute-2 python3.9[45082]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:37:51 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 09:37:51 compute-2 systemd[1]: Starting man-db-cache-update.service...
Oct 10 09:37:51 compute-2 systemd[1]: Reloading.
Oct 10 09:37:51 compute-2 systemd-rc-local-generator[45135]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:37:51 compute-2 systemd-sysv-generator[45141]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:37:51 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 09:37:52 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 09:37:52 compute-2 systemd[1]: Finished man-db-cache-update.service.
Oct 10 09:37:52 compute-2 systemd[1]: run-raa7ab797ccac4f57a8f8a7c82554e885.service: Deactivated successfully.
Oct 10 09:37:52 compute-2 sudo[45080]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:55 compute-2 sudo[45542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwuqiiurwknznqlqvtfvmdxitwnwpalc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089074.709404-502-258016171552538/AnsiballZ_stat.py'
Oct 10 09:37:55 compute-2 sudo[45542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:55 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 09:37:55 compute-2 python3.9[45544]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:37:55 compute-2 sudo[45542]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:56 compute-2 sudo[45694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukqsnfbmhvcjkfuudfrpdfbuqtpijsnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089075.5311685-529-245236730998978/AnsiballZ_ini_file.py'
Oct 10 09:37:56 compute-2 sudo[45694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:56 compute-2 python3.9[45696]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:37:56 compute-2 sudo[45694]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:57 compute-2 sudo[45848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxnczsekeasecrgdkmlvgaqvjvkqjdcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089076.745706-559-280443934050460/AnsiballZ_ini_file.py'
Oct 10 09:37:57 compute-2 sudo[45848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:57 compute-2 python3.9[45850]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:37:57 compute-2 sudo[45848]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:57 compute-2 sudo[46000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdgrammzrvjcdfwqzhuprmutxnpbwald ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089077.4343088-559-279587539951267/AnsiballZ_ini_file.py'
Oct 10 09:37:57 compute-2 sudo[46000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:57 compute-2 python3.9[46002]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:37:57 compute-2 sudo[46000]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:58 compute-2 sudo[46152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myajlwdgvrejjfzzwcycfqrbotctuqhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089078.2808037-604-25728940612088/AnsiballZ_ini_file.py'
Oct 10 09:37:58 compute-2 sudo[46152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:58 compute-2 python3.9[46154]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:37:58 compute-2 sudo[46152]: pam_unix(sudo:session): session closed for user root
Oct 10 09:37:59 compute-2 sudo[46304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymjyajulberrsspluazpqxxokkkwpoxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089078.9889326-604-202949812186549/AnsiballZ_ini_file.py'
Oct 10 09:37:59 compute-2 sudo[46304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:37:59 compute-2 python3.9[46306]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:37:59 compute-2 sudo[46304]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:00 compute-2 sudo[46456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqieaglbmjoenerxlcolqaicxcbaroiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089079.8330784-650-90231972118996/AnsiballZ_stat.py'
Oct 10 09:38:00 compute-2 sudo[46456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:00 compute-2 python3.9[46458]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:38:00 compute-2 sudo[46456]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:01 compute-2 sudo[46579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymazntftanrcfvgbyfznbbrbuomhaksx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089079.8330784-650-90231972118996/AnsiballZ_copy.py'
Oct 10 09:38:01 compute-2 sudo[46579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:01 compute-2 python3.9[46581]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089079.8330784-650-90231972118996/.source _original_basename=.v6b9jex6 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:38:01 compute-2 sudo[46579]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:01 compute-2 sudo[46731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueplmgxcigxznxzwtmrkrrmnpmxogpti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089081.474195-694-72575033090190/AnsiballZ_file.py'
Oct 10 09:38:01 compute-2 sudo[46731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:01 compute-2 python3.9[46733]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:38:02 compute-2 sudo[46731]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:02 compute-2 sudo[46883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmxolmprtsobgodstzlovhuypkjyezxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089082.2603378-718-169854136679687/AnsiballZ_edpm_os_net_config_mappings.py'
Oct 10 09:38:02 compute-2 sudo[46883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:02 compute-2 python3.9[46885]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct 10 09:38:02 compute-2 sudo[46883]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:03 compute-2 sudo[47035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxdmzqeliicopzmdmeymaatcpsqsyqgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089083.2484837-745-249239499694383/AnsiballZ_file.py'
Oct 10 09:38:03 compute-2 sudo[47035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:03 compute-2 python3.9[47037]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:38:03 compute-2 sudo[47035]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:04 compute-2 sudo[47187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjirfsikxdkhybmgvlwzylkcyigszcwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089084.104634-775-184730443891128/AnsiballZ_stat.py'
Oct 10 09:38:04 compute-2 sudo[47187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:04 compute-2 sudo[47187]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:05 compute-2 sudo[47310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzomvmoerzjahliwwqahmhqncmeqnbqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089084.104634-775-184730443891128/AnsiballZ_copy.py'
Oct 10 09:38:05 compute-2 sudo[47310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:05 compute-2 sudo[47310]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:06 compute-2 sudo[47462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqdzcruairlhakwfzsupttbtbvovkebj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089085.677273-821-32167668296630/AnsiballZ_slurp.py'
Oct 10 09:38:06 compute-2 sudo[47462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:06 compute-2 python3.9[47464]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct 10 09:38:06 compute-2 sudo[47462]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:07 compute-2 sudo[47637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuwuqonrbtsqihrnawqjvfbdmqsaldoc ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089086.6110146-847-87432343928043/async_wrapper.py j919480864945 300 /home/zuul/.ansible/tmp/ansible-tmp-1760089086.6110146-847-87432343928043/AnsiballZ_edpm_os_net_config.py _'
Oct 10 09:38:07 compute-2 sudo[47637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:07 compute-2 ansible-async_wrapper.py[47639]: Invoked with j919480864945 300 /home/zuul/.ansible/tmp/ansible-tmp-1760089086.6110146-847-87432343928043/AnsiballZ_edpm_os_net_config.py _
Oct 10 09:38:07 compute-2 ansible-async_wrapper.py[47642]: Starting module and watcher
Oct 10 09:38:07 compute-2 ansible-async_wrapper.py[47642]: Start watching 47643 (300)
Oct 10 09:38:07 compute-2 ansible-async_wrapper.py[47643]: Start module (47643)
Oct 10 09:38:07 compute-2 ansible-async_wrapper.py[47639]: Return async_wrapper task started.
Oct 10 09:38:07 compute-2 sudo[47637]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:07 compute-2 python3.9[47644]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct 10 09:38:08 compute-2 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct 10 09:38:08 compute-2 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct 10 09:38:08 compute-2 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct 10 09:38:08 compute-2 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct 10 09:38:08 compute-2 kernel: cfg80211: failed to load regulatory.db
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4239] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4256] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4752] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4754] audit: op="connection-add" uuid="a4ef5fcb-9a34-4167-b950-5e9b3ed48e8a" name="br-ex-br" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4770] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4771] audit: op="connection-add" uuid="6ffc8a82-12bb-4304-b569-c4c1a4a906ee" name="br-ex-port" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4785] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4787] audit: op="connection-add" uuid="1dc9e23e-eafb-472b-bc4f-5974c5384b39" name="eth1-port" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4800] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4801] audit: op="connection-add" uuid="bb9ab156-ca23-43ce-9fed-d3863964f080" name="vlan20-port" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4816] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4818] audit: op="connection-add" uuid="1d7d8008-311f-4f5a-9aba-3e7542f88a0c" name="vlan21-port" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4830] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4831] audit: op="connection-add" uuid="6221bcee-a358-45d6-92e9-7684dff6685a" name="vlan22-port" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4844] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4845] audit: op="connection-add" uuid="5173518e-a62a-4bd6-b50d-9a0011562c36" name="vlan23-port" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4865] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4879] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4880] audit: op="connection-add" uuid="c0226397-8fe0-490f-96fc-b4257f699165" name="br-ex-if" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4933] audit: op="connection-update" uuid="97070329-66da-5289-8aaa-712e43fb35a8" name="ci-private-network" args="ipv4.routing-rules,ipv4.addresses,ipv4.method,ipv4.never-default,ipv4.dns,ipv4.routes,connection.controller,connection.port-type,connection.slave-type,connection.master,connection.timestamp,ipv6.routing-rules,ipv6.addresses,ipv6.method,ipv6.addr-gen-mode,ipv6.dns,ipv6.routes,ovs-external-ids.data,ovs-interface.type" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4945] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4946] audit: op="connection-add" uuid="6198a381-3767-409f-ba8e-52460604a9a6" name="vlan20-if" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4959] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4960] audit: op="connection-add" uuid="592c8809-2f1b-409d-9bb5-155d2c80c0a5" name="vlan21-if" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4974] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4975] audit: op="connection-add" uuid="a4a3f96a-e8f4-4c43-aa05-97308511e250" name="vlan22-if" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4989] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.4990] audit: op="connection-add" uuid="abd24fa1-cceb-41ce-a984-04a6bd98b8cb" name="vlan23-if" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5000] audit: op="connection-delete" uuid="9070ab9c-fab6-3aab-b68b-48035af180d0" name="Wired connection 1" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5010] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5019] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5022] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (a4ef5fcb-9a34-4167-b950-5e9b3ed48e8a)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5022] audit: op="connection-activate" uuid="a4ef5fcb-9a34-4167-b950-5e9b3ed48e8a" name="br-ex-br" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5023] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5028] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5031] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (6ffc8a82-12bb-4304-b569-c4c1a4a906ee)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5032] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5036] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5039] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (1dc9e23e-eafb-472b-bc4f-5974c5384b39)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5040] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5044] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5047] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (bb9ab156-ca23-43ce-9fed-d3863964f080)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5049] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5053] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5056] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (1d7d8008-311f-4f5a-9aba-3e7542f88a0c)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5057] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5061] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5064] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (6221bcee-a358-45d6-92e9-7684dff6685a)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5065] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5070] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5073] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (5173518e-a62a-4bd6-b50d-9a0011562c36)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5073] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5075] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5076] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5080] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5084] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5087] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (c0226397-8fe0-490f-96fc-b4257f699165)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5087] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5089] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5090] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5091] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5092] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5101] device (eth1): disconnecting for new activation request.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5101] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5103] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5104] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5105] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5107] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5110] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5112] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (6198a381-3767-409f-ba8e-52460604a9a6)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5113] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5115] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5116] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5116] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5118] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5121] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5124] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (592c8809-2f1b-409d-9bb5-155d2c80c0a5)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5124] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5126] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5127] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5128] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5129] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5132] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5135] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (a4a3f96a-e8f4-4c43-aa05-97308511e250)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5136] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5137] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5139] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5139] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5141] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5144] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5150] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (abd24fa1-cceb-41ce-a984-04a6bd98b8cb)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5151] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5153] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5154] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5155] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5156] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5166] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5168] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5170] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5171] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5176] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5180] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5183] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5185] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5186] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5190] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5193] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 kernel: ovs-system: entered promiscuous mode
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5195] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5196] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5200] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5202] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5204] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5206] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5209] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5211] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5213] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5215] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 kernel: Timeout policy base is empty
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5220] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5226] dhcp4 (eth0): canceled DHCP transaction
Oct 10 09:38:09 compute-2 systemd-udevd[47650]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5228] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5228] dhcp4 (eth0): state changed no lease
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5230] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5241] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5244] audit: op="device-reapply" interface="eth1" ifindex=3 pid=47645 uid=0 result="fail" reason="Device is not activated"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5249] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5283] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5285] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5291] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct 10 09:38:09 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5329] device (eth1): disconnecting for new activation request.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5330] audit: op="connection-activate" uuid="97070329-66da-5289-8aaa-712e43fb35a8" name="ci-private-network" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5331] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5335] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5424] device (eth1): Activation: starting connection 'ci-private-network' (97070329-66da-5289-8aaa-712e43fb35a8)
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5428] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5442] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5446] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5453] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5457] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5462] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5464] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5466] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5468] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5470] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5472] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5474] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47645 uid=0 result="success"
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5477] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5483] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5489] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5493] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5497] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5501] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5505] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5509] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5513] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5517] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5521] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5525] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5529] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5534] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5537] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 kernel: br-ex: entered promiscuous mode
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5581] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5583] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5589] device (eth1): Activation: successful, device activated.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5681] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5694] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 kernel: vlan22: entered promiscuous mode
Oct 10 09:38:09 compute-2 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5724] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5725] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5731] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 10 09:38:09 compute-2 kernel: vlan23: entered promiscuous mode
Oct 10 09:38:09 compute-2 systemd-udevd[47649]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5853] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5867] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5884] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5885] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5892] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 10 09:38:09 compute-2 kernel: vlan20: entered promiscuous mode
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5932] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5947] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5962] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5963] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.5970] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 10 09:38:09 compute-2 kernel: vlan21: entered promiscuous mode
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.6031] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.6044] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.6059] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.6060] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.6066] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.6097] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.6110] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.6123] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.6124] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 09:38:09 compute-2 NetworkManager[44866]: <info>  [1760089089.6130] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 10 09:38:10 compute-2 NetworkManager[44866]: <info>  [1760089090.7288] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47645 uid=0 result="success"
Oct 10 09:38:10 compute-2 NetworkManager[44866]: <info>  [1760089090.9024] checkpoint[0x56502befc950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct 10 09:38:10 compute-2 NetworkManager[44866]: <info>  [1760089090.9025] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47645 uid=0 result="success"
Oct 10 09:38:11 compute-2 sudo[48002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqqtmplrrxujyjsxvkylujipqemvuhos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089090.7020025-847-273887782213220/AnsiballZ_async_status.py'
Oct 10 09:38:11 compute-2 NetworkManager[44866]: <info>  [1760089091.1450] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47645 uid=0 result="success"
Oct 10 09:38:11 compute-2 sudo[48002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:11 compute-2 NetworkManager[44866]: <info>  [1760089091.1463] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47645 uid=0 result="success"
Oct 10 09:38:11 compute-2 NetworkManager[44866]: <info>  [1760089091.3273] audit: op="networking-control" arg="global-dns-configuration" pid=47645 uid=0 result="success"
Oct 10 09:38:11 compute-2 NetworkManager[44866]: <info>  [1760089091.3301] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct 10 09:38:11 compute-2 NetworkManager[44866]: <info>  [1760089091.3329] audit: op="networking-control" arg="global-dns-configuration" pid=47645 uid=0 result="success"
Oct 10 09:38:11 compute-2 NetworkManager[44866]: <info>  [1760089091.3346] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47645 uid=0 result="success"
Oct 10 09:38:11 compute-2 python3.9[48004]: ansible-ansible.legacy.async_status Invoked with jid=j919480864945.47639 mode=status _async_dir=/root/.ansible_async
Oct 10 09:38:11 compute-2 sudo[48002]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:11 compute-2 NetworkManager[44866]: <info>  [1760089091.5134] checkpoint[0x56502befca20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct 10 09:38:11 compute-2 NetworkManager[44866]: <info>  [1760089091.5138] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47645 uid=0 result="success"
Oct 10 09:38:11 compute-2 ansible-async_wrapper.py[47643]: Module complete (47643)
Oct 10 09:38:12 compute-2 ansible-async_wrapper.py[47642]: Done in kid B.
Oct 10 09:38:14 compute-2 sudo[48106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsmzhsnogyrgdljxidssvzujufusmfgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089090.7020025-847-273887782213220/AnsiballZ_async_status.py'
Oct 10 09:38:14 compute-2 sudo[48106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:14 compute-2 python3.9[48108]: ansible-ansible.legacy.async_status Invoked with jid=j919480864945.47639 mode=status _async_dir=/root/.ansible_async
Oct 10 09:38:14 compute-2 sudo[48106]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:14 compute-2 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 10 09:38:15 compute-2 sudo[48208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntkkahxcrdwjdtthvsirissrsehljtdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089090.7020025-847-273887782213220/AnsiballZ_async_status.py'
Oct 10 09:38:15 compute-2 sudo[48208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:15 compute-2 python3.9[48210]: ansible-ansible.legacy.async_status Invoked with jid=j919480864945.47639 mode=cleanup _async_dir=/root/.ansible_async
Oct 10 09:38:15 compute-2 sudo[48208]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:16 compute-2 sudo[48360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuoxyvvctwbgcpbkonxaakitxzunayog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089095.8322594-928-59179713041241/AnsiballZ_stat.py'
Oct 10 09:38:16 compute-2 sudo[48360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:16 compute-2 python3.9[48362]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:38:16 compute-2 sudo[48360]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:16 compute-2 sudo[48483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wonnaejshgvdatarqxmwxyanpihhsmpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089095.8322594-928-59179713041241/AnsiballZ_copy.py'
Oct 10 09:38:16 compute-2 sudo[48483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:16 compute-2 python3.9[48485]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089095.8322594-928-59179713041241/.source.returncode _original_basename=.x1t8il0z follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:38:16 compute-2 sudo[48483]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:17 compute-2 sudo[48635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moslmkbdiwyprjsrnkqdsjvjwqvqhnyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089097.23963-976-280840722254279/AnsiballZ_stat.py'
Oct 10 09:38:17 compute-2 sudo[48635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:17 compute-2 python3.9[48637]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:38:17 compute-2 sudo[48635]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:18 compute-2 sudo[48759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkalfirrsshthmzufvvgrhpcjxbdrdww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089097.23963-976-280840722254279/AnsiballZ_copy.py'
Oct 10 09:38:18 compute-2 sudo[48759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:18 compute-2 python3.9[48761]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089097.23963-976-280840722254279/.source.cfg _original_basename=.0ujjls6y follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:38:18 compute-2 sudo[48759]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:19 compute-2 sudo[48911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiynotxluvjhyeuyyronzuuqcxityocg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089098.8174126-1021-71172580986938/AnsiballZ_systemd.py'
Oct 10 09:38:19 compute-2 sudo[48911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:19 compute-2 python3.9[48913]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 09:38:19 compute-2 systemd[1]: Reloading Network Manager...
Oct 10 09:38:19 compute-2 NetworkManager[44866]: <info>  [1760089099.5072] audit: op="reload" arg="0" pid=48917 uid=0 result="success"
Oct 10 09:38:19 compute-2 NetworkManager[44866]: <info>  [1760089099.5078] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct 10 09:38:19 compute-2 systemd[1]: Reloaded Network Manager.
Oct 10 09:38:19 compute-2 sudo[48911]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:19 compute-2 sshd-session[40866]: Connection closed by 192.168.122.30 port 40112
Oct 10 09:38:20 compute-2 sshd-session[40863]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:38:20 compute-2 systemd[1]: session-11.scope: Deactivated successfully.
Oct 10 09:38:20 compute-2 systemd[1]: session-11.scope: Consumed 53.606s CPU time.
Oct 10 09:38:20 compute-2 systemd-logind[796]: Session 11 logged out. Waiting for processes to exit.
Oct 10 09:38:20 compute-2 systemd-logind[796]: Removed session 11.
Oct 10 09:38:25 compute-2 sshd-session[48948]: Accepted publickey for zuul from 192.168.122.30 port 59486 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:38:25 compute-2 systemd-logind[796]: New session 12 of user zuul.
Oct 10 09:38:25 compute-2 systemd[1]: Started Session 12 of User zuul.
Oct 10 09:38:25 compute-2 sshd-session[48948]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:38:26 compute-2 python3.9[49101]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:38:27 compute-2 python3.9[49255]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:38:29 compute-2 python3.9[49449]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:38:29 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 09:38:29 compute-2 sshd-session[48951]: Connection closed by 192.168.122.30 port 59486
Oct 10 09:38:29 compute-2 sshd-session[48948]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:38:29 compute-2 systemd[1]: session-12.scope: Deactivated successfully.
Oct 10 09:38:29 compute-2 systemd[1]: session-12.scope: Consumed 2.384s CPU time.
Oct 10 09:38:29 compute-2 systemd-logind[796]: Session 12 logged out. Waiting for processes to exit.
Oct 10 09:38:29 compute-2 systemd-logind[796]: Removed session 12.
Oct 10 09:38:35 compute-2 sshd-session[49478]: Accepted publickey for zuul from 192.168.122.30 port 32984 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:38:35 compute-2 systemd-logind[796]: New session 13 of user zuul.
Oct 10 09:38:35 compute-2 systemd[1]: Started Session 13 of User zuul.
Oct 10 09:38:35 compute-2 sshd-session[49478]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:38:36 compute-2 python3.9[49631]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:38:37 compute-2 python3.9[49785]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:38:38 compute-2 sudo[49940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frcxencugwokzmajhwcwsozfbfkvodni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089117.80456-82-272756883645333/AnsiballZ_setup.py'
Oct 10 09:38:38 compute-2 sudo[49940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:38 compute-2 python3.9[49942]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:38:38 compute-2 sudo[49940]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:39 compute-2 sudo[50024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkyjrfjjtgtlfwmdwhuxagplkltzutsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089117.80456-82-272756883645333/AnsiballZ_dnf.py'
Oct 10 09:38:39 compute-2 sudo[50024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:39 compute-2 python3.9[50026]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:38:40 compute-2 sudo[50024]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:41 compute-2 sudo[50178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcjayigzdqmjcafyualyluqohiulwwey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089120.7541394-118-225553689408574/AnsiballZ_setup.py'
Oct 10 09:38:41 compute-2 sudo[50178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:41 compute-2 python3.9[50180]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:38:41 compute-2 sudo[50178]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:42 compute-2 sudo[50373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qccdodarmekwszvhearvvofzmwmfczvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089122.2510698-151-106116999432157/AnsiballZ_file.py'
Oct 10 09:38:42 compute-2 sudo[50373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:42 compute-2 python3.9[50375]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:38:42 compute-2 sudo[50373]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:43 compute-2 sudo[50525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyqnzsrcjrvkobfwiokdpohdjdgydqjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089123.2249508-175-19260611183723/AnsiballZ_command.py'
Oct 10 09:38:43 compute-2 sudo[50525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:43 compute-2 python3.9[50527]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:38:43 compute-2 systemd[1]: var-lib-containers-storage-overlay-compat3112466313-merged.mount: Deactivated successfully.
Oct 10 09:38:43 compute-2 podman[50528]: 2025-10-10 09:38:43.934936336 +0000 UTC m=+0.067274571 system refresh
Oct 10 09:38:43 compute-2 sudo[50525]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:44 compute-2 sudo[50688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgtyfriukcfsipmniajfxdcrnskkwotc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089124.321155-199-136222184130636/AnsiballZ_stat.py'
Oct 10 09:38:44 compute-2 sudo[50688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:44 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:38:45 compute-2 python3.9[50690]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:38:45 compute-2 sudo[50688]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:45 compute-2 sudo[50811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srrzanmnuotlcjcsqubhkudzgthcjlfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089124.321155-199-136222184130636/AnsiballZ_copy.py'
Oct 10 09:38:45 compute-2 sudo[50811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:45 compute-2 python3.9[50813]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089124.321155-199-136222184130636/.source.json follow=False _original_basename=podman_network_config.j2 checksum=bc749cbd8dd097470751e86f47dabc032c51f5ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:38:45 compute-2 sudo[50811]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:46 compute-2 sudo[50963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbmwhpviotvxvdpmrkzfymacigtzktkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089126.0124109-245-101609462851517/AnsiballZ_stat.py'
Oct 10 09:38:46 compute-2 sudo[50963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:46 compute-2 python3.9[50965]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:38:46 compute-2 sudo[50963]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:46 compute-2 sudo[51086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luoffpgvflpklfbxilgqytehuribcupe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089126.0124109-245-101609462851517/AnsiballZ_copy.py'
Oct 10 09:38:46 compute-2 sudo[51086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:47 compute-2 python3.9[51088]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760089126.0124109-245-101609462851517/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:38:47 compute-2 sudo[51086]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:47 compute-2 sudo[51238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnmxviuhvrndmbyjmedghlplmeqqvugz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089127.4657912-292-32597033903033/AnsiballZ_ini_file.py'
Oct 10 09:38:47 compute-2 sudo[51238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:48 compute-2 python3.9[51240]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:38:48 compute-2 sudo[51238]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:48 compute-2 sudo[51390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqikmxbyqvxwrwewfaasiwlgmeioofme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089128.2958565-292-233953080648357/AnsiballZ_ini_file.py'
Oct 10 09:38:48 compute-2 sudo[51390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:48 compute-2 python3.9[51392]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:38:48 compute-2 sudo[51390]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:49 compute-2 sudo[51542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgjeoexfdwnuisokgspyhqgipqrqmoyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089128.933453-292-230872236548701/AnsiballZ_ini_file.py'
Oct 10 09:38:49 compute-2 sudo[51542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:49 compute-2 python3.9[51544]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:38:49 compute-2 sudo[51542]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:49 compute-2 sudo[51694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlhvfcimfookdvnwdcyhhwfljrskutay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089129.6744666-292-91481553513425/AnsiballZ_ini_file.py'
Oct 10 09:38:49 compute-2 sudo[51694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:50 compute-2 python3.9[51696]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:38:50 compute-2 sudo[51694]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:51 compute-2 sudo[51846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tultidfeyyxsqarqoephlxsegbmzfyzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089130.8225148-385-223857150199398/AnsiballZ_dnf.py'
Oct 10 09:38:51 compute-2 sudo[51846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:51 compute-2 python3.9[51848]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:38:52 compute-2 sudo[51846]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:53 compute-2 sudo[51999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unlgryvmonnzlpaemrgjtdnfkyzkjqyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089133.2477534-418-109273091835103/AnsiballZ_setup.py'
Oct 10 09:38:53 compute-2 sudo[51999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:53 compute-2 python3.9[52001]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:38:53 compute-2 sudo[51999]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:54 compute-2 sudo[52153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dueqbmozyqrruitcxpinuxmoawpylpeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089134.2108707-442-256616055832809/AnsiballZ_stat.py'
Oct 10 09:38:54 compute-2 sudo[52153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:54 compute-2 python3.9[52155]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:38:54 compute-2 sudo[52153]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:55 compute-2 sudo[52305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isazhcrskcpmxvvchunawewjpnugjeox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089135.1006248-469-236343721017345/AnsiballZ_stat.py'
Oct 10 09:38:55 compute-2 sudo[52305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:55 compute-2 python3.9[52307]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:38:55 compute-2 sudo[52305]: pam_unix(sudo:session): session closed for user root
Oct 10 09:38:56 compute-2 sudo[52457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmplvnyqfikopsfynyhupmuyjemtszkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089136.0281727-499-182652769654401/AnsiballZ_service_facts.py'
Oct 10 09:38:56 compute-2 sudo[52457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:38:56 compute-2 python3.9[52459]: ansible-service_facts Invoked
Oct 10 09:38:56 compute-2 network[52476]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 09:38:56 compute-2 network[52477]: 'network-scripts' will be removed from distribution in near future.
Oct 10 09:38:56 compute-2 network[52478]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 09:38:59 compute-2 sudo[52457]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:01 compute-2 sudo[52763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyxmeciokcpjkekyilnposnyksblgaag ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1760089141.2000673-538-208825254006809/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1760089141.2000673-538-208825254006809/args'
Oct 10 09:39:01 compute-2 sudo[52763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:01 compute-2 sudo[52763]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:02 compute-2 sudo[52930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyuejhmlzabmxvdozbhfbuojetpsryzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089142.1495795-571-250357367295695/AnsiballZ_dnf.py'
Oct 10 09:39:02 compute-2 sudo[52930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:02 compute-2 python3.9[52932]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:39:03 compute-2 sudo[52930]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:05 compute-2 sudo[53083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezjuzgzybixsdvrdacxhtaqhmbqnozvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089144.485547-610-58778221139444/AnsiballZ_package_facts.py'
Oct 10 09:39:05 compute-2 sudo[53083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:05 compute-2 python3.9[53085]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct 10 09:39:05 compute-2 sudo[53083]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:07 compute-2 sudo[53235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cryrhcckblksaencfrmcoidgtbbfaqef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089146.6440942-641-59427780666903/AnsiballZ_stat.py'
Oct 10 09:39:07 compute-2 sudo[53235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:07 compute-2 python3.9[53237]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:39:07 compute-2 sudo[53235]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:07 compute-2 sudo[53360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfnpqizmlibfrteegppqhyiaeggoxdlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089146.6440942-641-59427780666903/AnsiballZ_copy.py'
Oct 10 09:39:07 compute-2 sudo[53360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:07 compute-2 python3.9[53362]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089146.6440942-641-59427780666903/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:07 compute-2 sudo[53360]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:08 compute-2 sudo[53514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djzcxzolrdafcgreyjzuevudoikfwsvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089148.2733202-687-247464561094948/AnsiballZ_stat.py'
Oct 10 09:39:08 compute-2 sudo[53514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:08 compute-2 python3.9[53516]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:39:08 compute-2 sudo[53514]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:09 compute-2 sudo[53639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-affamombodomxucepptidihxorrlthcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089148.2733202-687-247464561094948/AnsiballZ_copy.py'
Oct 10 09:39:09 compute-2 sudo[53639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:09 compute-2 python3.9[53641]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089148.2733202-687-247464561094948/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:09 compute-2 sudo[53639]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:10 compute-2 sudo[53793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apzbsepkoeitnuwoxjvljmkplpmodfjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089150.46534-750-242845250206925/AnsiballZ_lineinfile.py'
Oct 10 09:39:10 compute-2 sudo[53793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:11 compute-2 python3.9[53795]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:11 compute-2 sudo[53793]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:12 compute-2 sudo[53947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxldzdytoapxbxnrvxlejnzdvkpvvgpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089152.2087085-795-72279905137232/AnsiballZ_setup.py'
Oct 10 09:39:12 compute-2 sudo[53947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:12 compute-2 python3.9[53949]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:39:12 compute-2 sudo[53947]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:13 compute-2 sudo[54031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnorwuekvnevezpugiqvpjkyguweflre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089152.2087085-795-72279905137232/AnsiballZ_systemd.py'
Oct 10 09:39:13 compute-2 sudo[54031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:14 compute-2 python3.9[54033]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:39:15 compute-2 sudo[54031]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:16 compute-2 sudo[54185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueqoqavyhgdwzihxjalrwmkhqqqfelty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089156.0364683-843-186305467853304/AnsiballZ_setup.py'
Oct 10 09:39:16 compute-2 sudo[54185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:16 compute-2 python3.9[54187]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:39:16 compute-2 sudo[54185]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:17 compute-2 sudo[54269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aayxudlmggwmckajziesgtyzjrgfrjda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089156.0364683-843-186305467853304/AnsiballZ_systemd.py'
Oct 10 09:39:17 compute-2 sudo[54269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:17 compute-2 python3.9[54271]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 09:39:17 compute-2 chronyd[785]: chronyd exiting
Oct 10 09:39:17 compute-2 systemd[1]: Stopping NTP client/server...
Oct 10 09:39:17 compute-2 systemd[1]: chronyd.service: Deactivated successfully.
Oct 10 09:39:17 compute-2 systemd[1]: Stopped NTP client/server.
Oct 10 09:39:17 compute-2 systemd[1]: Starting NTP client/server...
Oct 10 09:39:17 compute-2 chronyd[54279]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 10 09:39:17 compute-2 chronyd[54279]: Frequency -32.237 +/- 0.239 ppm read from /var/lib/chrony/drift
Oct 10 09:39:17 compute-2 chronyd[54279]: Loaded seccomp filter (level 2)
Oct 10 09:39:17 compute-2 systemd[1]: Started NTP client/server.
Oct 10 09:39:17 compute-2 sudo[54269]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:18 compute-2 sshd-session[49481]: Connection closed by 192.168.122.30 port 32984
Oct 10 09:39:18 compute-2 sshd-session[49478]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:39:18 compute-2 systemd[1]: session-13.scope: Deactivated successfully.
Oct 10 09:39:18 compute-2 systemd[1]: session-13.scope: Consumed 26.664s CPU time.
Oct 10 09:39:18 compute-2 systemd-logind[796]: Session 13 logged out. Waiting for processes to exit.
Oct 10 09:39:18 compute-2 systemd-logind[796]: Removed session 13.
Oct 10 09:39:24 compute-2 sshd-session[54305]: Accepted publickey for zuul from 192.168.122.30 port 51964 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:39:24 compute-2 systemd-logind[796]: New session 14 of user zuul.
Oct 10 09:39:24 compute-2 systemd[1]: Started Session 14 of User zuul.
Oct 10 09:39:24 compute-2 sshd-session[54305]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:39:24 compute-2 sudo[54458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sssgceqqqboygwslfwqiflewnmqjxjbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089164.1922965-28-160542294617642/AnsiballZ_file.py'
Oct 10 09:39:24 compute-2 sudo[54458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:24 compute-2 python3.9[54460]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:24 compute-2 sudo[54458]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:25 compute-2 sudo[54610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmlxuozwzuyfjufbkgrttopadmgcrqdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089165.1210947-64-152240516221048/AnsiballZ_stat.py'
Oct 10 09:39:25 compute-2 sudo[54610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:25 compute-2 python3.9[54612]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:39:25 compute-2 sudo[54610]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:26 compute-2 sudo[54733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfywuqqgqdqyfgupumiioqkjxowelcqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089165.1210947-64-152240516221048/AnsiballZ_copy.py'
Oct 10 09:39:26 compute-2 sudo[54733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:26 compute-2 python3.9[54735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089165.1210947-64-152240516221048/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:26 compute-2 sudo[54733]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:26 compute-2 sshd-session[54308]: Connection closed by 192.168.122.30 port 51964
Oct 10 09:39:26 compute-2 sshd-session[54305]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:39:26 compute-2 systemd[1]: session-14.scope: Deactivated successfully.
Oct 10 09:39:26 compute-2 systemd[1]: session-14.scope: Consumed 1.570s CPU time.
Oct 10 09:39:26 compute-2 systemd-logind[796]: Session 14 logged out. Waiting for processes to exit.
Oct 10 09:39:26 compute-2 systemd-logind[796]: Removed session 14.
Oct 10 09:39:32 compute-2 sshd-session[54760]: Accepted publickey for zuul from 192.168.122.30 port 51970 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:39:32 compute-2 systemd-logind[796]: New session 15 of user zuul.
Oct 10 09:39:32 compute-2 systemd[1]: Started Session 15 of User zuul.
Oct 10 09:39:32 compute-2 sshd-session[54760]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:39:33 compute-2 python3.9[54913]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:39:34 compute-2 sudo[55067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tweqxubzvkdolqfsvyfdwijtfhgtngjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089173.8532472-61-271410893777122/AnsiballZ_file.py'
Oct 10 09:39:34 compute-2 sudo[55067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:34 compute-2 python3.9[55069]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:34 compute-2 sudo[55067]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:35 compute-2 sudo[55242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcqzlmgjbghexwteujiqdddhfczfpoer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089174.75691-85-137419652153510/AnsiballZ_stat.py'
Oct 10 09:39:35 compute-2 sudo[55242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:35 compute-2 python3.9[55244]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:39:35 compute-2 sudo[55242]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:36 compute-2 sudo[55365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjggbygcegdeymkctazyscqrldbtwbzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089174.75691-85-137419652153510/AnsiballZ_copy.py'
Oct 10 09:39:36 compute-2 sudo[55365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:36 compute-2 python3.9[55367]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1760089174.75691-85-137419652153510/.source.json _original_basename=.omsu_u7_ follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:36 compute-2 sudo[55365]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:37 compute-2 sudo[55517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrbligernsdtvcvqsygmwkduhunehihu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089176.8322563-154-102924880549786/AnsiballZ_stat.py'
Oct 10 09:39:37 compute-2 sudo[55517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:37 compute-2 python3.9[55519]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:39:37 compute-2 sudo[55517]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:37 compute-2 sudo[55640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nooghqhvrtktzzypywxfoopswonvvimf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089176.8322563-154-102924880549786/AnsiballZ_copy.py'
Oct 10 09:39:37 compute-2 sudo[55640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:37 compute-2 python3.9[55642]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089176.8322563-154-102924880549786/.source _original_basename=.5nbob3pq follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:37 compute-2 sudo[55640]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:38 compute-2 sudo[55792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epplegyivjgjrtuyptzehkgmngehtbmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089178.3596864-203-161514485366824/AnsiballZ_file.py'
Oct 10 09:39:38 compute-2 sudo[55792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:38 compute-2 python3.9[55794]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:39:38 compute-2 sudo[55792]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:39 compute-2 sudo[55944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jofonqblcqpkxiplmcteaypxcbsjufvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089179.2197025-226-54817425253456/AnsiballZ_stat.py'
Oct 10 09:39:39 compute-2 sudo[55944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:39 compute-2 python3.9[55946]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:39:39 compute-2 sudo[55944]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:40 compute-2 sudo[56067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbfpntojxrxgzkylphtzaxtzptbirbnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089179.2197025-226-54817425253456/AnsiballZ_copy.py'
Oct 10 09:39:40 compute-2 sudo[56067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:40 compute-2 python3.9[56069]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760089179.2197025-226-54817425253456/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:39:40 compute-2 sudo[56067]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:40 compute-2 sudo[56219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zapetyvgsxjovmeqozlarvuvbqhtqsmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089180.429285-226-196869450885258/AnsiballZ_stat.py'
Oct 10 09:39:40 compute-2 sudo[56219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:40 compute-2 python3.9[56221]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:39:40 compute-2 sudo[56219]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:41 compute-2 sudo[56342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xljdvwodjvbjzhnyujyewkgzkzcymipq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089180.429285-226-196869450885258/AnsiballZ_copy.py'
Oct 10 09:39:41 compute-2 sudo[56342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:41 compute-2 python3.9[56344]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760089180.429285-226-196869450885258/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:39:41 compute-2 sudo[56342]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:42 compute-2 sudo[56494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myuatgvndbynyfgncvakfzqfutwrhwyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089182.1471314-313-236858903971383/AnsiballZ_file.py'
Oct 10 09:39:42 compute-2 sudo[56494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:42 compute-2 python3.9[56496]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:42 compute-2 sudo[56494]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:43 compute-2 sudo[56646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-culfploresvdcirhyhzumbnwfmfmjsbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089182.9457266-338-65113596835969/AnsiballZ_stat.py'
Oct 10 09:39:43 compute-2 sudo[56646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:43 compute-2 python3.9[56648]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:39:43 compute-2 sudo[56646]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:43 compute-2 sudo[56769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pujfdkknuejkykjwemltaqzflobystys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089182.9457266-338-65113596835969/AnsiballZ_copy.py'
Oct 10 09:39:43 compute-2 sudo[56769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:44 compute-2 python3.9[56771]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089182.9457266-338-65113596835969/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:44 compute-2 sudo[56769]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:44 compute-2 sudo[56921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hodeactgezsbcljbfuvqvtpzutdfzhpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089184.3568327-383-210677791656687/AnsiballZ_stat.py'
Oct 10 09:39:44 compute-2 sudo[56921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:44 compute-2 python3.9[56923]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:39:44 compute-2 sudo[56921]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:45 compute-2 sudo[57044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeivgdfxkvzmhodrhaawsheklycclive ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089184.3568327-383-210677791656687/AnsiballZ_copy.py'
Oct 10 09:39:45 compute-2 sudo[57044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:45 compute-2 python3.9[57046]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089184.3568327-383-210677791656687/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:45 compute-2 sudo[57044]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:46 compute-2 sudo[57196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvugmqmljwgtyzrffvqmhunjtmhpfped ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089185.7234306-428-230408172872864/AnsiballZ_systemd.py'
Oct 10 09:39:46 compute-2 sudo[57196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:46 compute-2 python3.9[57198]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:39:46 compute-2 systemd[1]: Reloading.
Oct 10 09:39:46 compute-2 systemd-sysv-generator[57227]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:39:46 compute-2 systemd-rc-local-generator[57224]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:39:46 compute-2 systemd[1]: Reloading.
Oct 10 09:39:46 compute-2 systemd-rc-local-generator[57263]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:39:46 compute-2 systemd-sysv-generator[57266]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:39:47 compute-2 systemd[1]: Starting EDPM Container Shutdown...
Oct 10 09:39:47 compute-2 systemd[1]: Finished EDPM Container Shutdown.
Oct 10 09:39:47 compute-2 sudo[57196]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:47 compute-2 sudo[57423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwvfseuatryqgydkfqkmuaxmxajsezzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089187.425473-452-9103470690166/AnsiballZ_stat.py'
Oct 10 09:39:47 compute-2 sudo[57423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:47 compute-2 python3.9[57425]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:39:47 compute-2 sudo[57423]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:48 compute-2 sudo[57546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxphhynlmmwtcbrplacgysvzpatbxkay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089187.425473-452-9103470690166/AnsiballZ_copy.py'
Oct 10 09:39:48 compute-2 sudo[57546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:48 compute-2 python3.9[57548]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089187.425473-452-9103470690166/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:48 compute-2 sudo[57546]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:49 compute-2 sudo[57698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsbjazcoehrbudvvidryxvzmargmtgjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089188.8040833-496-172739657953163/AnsiballZ_stat.py'
Oct 10 09:39:49 compute-2 sudo[57698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:49 compute-2 python3.9[57700]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:39:49 compute-2 sudo[57698]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:49 compute-2 sudo[57821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmrylgjiylkoemfcpxmyxudlgfbopjpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089188.8040833-496-172739657953163/AnsiballZ_copy.py'
Oct 10 09:39:49 compute-2 sudo[57821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:49 compute-2 python3.9[57823]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089188.8040833-496-172739657953163/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:39:49 compute-2 sudo[57821]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:50 compute-2 sudo[57973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuymehyoeceugniyotkurjluyfmhongn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089190.2541602-542-192601879101393/AnsiballZ_systemd.py'
Oct 10 09:39:50 compute-2 sudo[57973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:51 compute-2 python3.9[57975]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:39:51 compute-2 systemd[1]: Reloading.
Oct 10 09:39:51 compute-2 systemd-rc-local-generator[58003]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:39:51 compute-2 systemd-sysv-generator[58007]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:39:51 compute-2 systemd[1]: Reloading.
Oct 10 09:39:51 compute-2 systemd-rc-local-generator[58037]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:39:51 compute-2 systemd-sysv-generator[58042]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:39:51 compute-2 systemd[1]: Starting Create netns directory...
Oct 10 09:39:51 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 09:39:51 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 09:39:51 compute-2 systemd[1]: Finished Create netns directory.
Oct 10 09:39:51 compute-2 sudo[57973]: pam_unix(sudo:session): session closed for user root
Oct 10 09:39:53 compute-2 python3.9[58201]: ansible-ansible.builtin.service_facts Invoked
Oct 10 09:39:53 compute-2 network[58218]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 09:39:53 compute-2 network[58219]: 'network-scripts' will be removed from distribution in near future.
Oct 10 09:39:53 compute-2 network[58220]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 09:39:59 compute-2 sudo[58482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrmupssabpkjqhoturwtmtosezrurbwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089198.730881-589-252047127196569/AnsiballZ_systemd.py'
Oct 10 09:39:59 compute-2 sudo[58482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:39:59 compute-2 python3.9[58484]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:39:59 compute-2 systemd[1]: Reloading.
Oct 10 09:39:59 compute-2 systemd-rc-local-generator[58514]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:39:59 compute-2 systemd-sysv-generator[58519]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:39:59 compute-2 systemd[1]: Stopping IPv4 firewall with iptables...
Oct 10 09:39:59 compute-2 iptables.init[58525]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct 10 09:40:00 compute-2 iptables.init[58525]: iptables: Flushing firewall rules: [  OK  ]
Oct 10 09:40:00 compute-2 systemd[1]: iptables.service: Deactivated successfully.
Oct 10 09:40:00 compute-2 systemd[1]: Stopped IPv4 firewall with iptables.
Oct 10 09:40:00 compute-2 sudo[58482]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:00 compute-2 sudo[58720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdeifceitjpqumpzsnnjjoplhotrynzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089200.2557187-589-196050407977007/AnsiballZ_systemd.py'
Oct 10 09:40:00 compute-2 sudo[58720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:00 compute-2 python3.9[58722]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:40:01 compute-2 sudo[58720]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:01 compute-2 sudo[58874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akkgqchdamwhpoorfgbtpnjwsxrgrlfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089201.3695288-637-279462472061509/AnsiballZ_systemd.py'
Oct 10 09:40:01 compute-2 sudo[58874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:01 compute-2 python3.9[58876]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:40:02 compute-2 systemd[1]: Reloading.
Oct 10 09:40:02 compute-2 systemd-sysv-generator[58909]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:40:02 compute-2 systemd-rc-local-generator[58905]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:40:02 compute-2 systemd[1]: Starting Netfilter Tables...
Oct 10 09:40:02 compute-2 systemd[1]: Finished Netfilter Tables.
Oct 10 09:40:02 compute-2 sudo[58874]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:03 compute-2 sudo[59066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmwmxvytqdwgjlfhqjrpkduhgrzusvmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089202.818422-661-170032539395767/AnsiballZ_command.py'
Oct 10 09:40:03 compute-2 sudo[59066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:03 compute-2 python3.9[59068]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:40:03 compute-2 sudo[59066]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:04 compute-2 sudo[59219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgtujjfyaogewfenlwrfdohygxdpsdgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089204.096978-704-157209696244298/AnsiballZ_stat.py'
Oct 10 09:40:04 compute-2 sudo[59219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:04 compute-2 python3.9[59221]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:40:04 compute-2 sudo[59219]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:05 compute-2 sudo[59344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpjusqcgnkiwltgnkuhucajnpcxanjrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089204.096978-704-157209696244298/AnsiballZ_copy.py'
Oct 10 09:40:05 compute-2 sudo[59344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:05 compute-2 python3.9[59346]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089204.096978-704-157209696244298/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:40:05 compute-2 sudo[59344]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:06 compute-2 python3.9[59497]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 09:40:06 compute-2 polkitd[7343]: Registered Authentication Agent for unix-process:59499:210983 (system bus name :1.525 [/usr/bin/pkttyagent --notify-fd 5 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Oct 10 09:40:31 compute-2 polkit-agent-helper-1[59511]: pam_unix(polkit-1:auth): conversation failed
Oct 10 09:40:31 compute-2 polkit-agent-helper-1[59511]: pam_unix(polkit-1:auth): auth could not identify password for [root]
Oct 10 09:40:31 compute-2 polkitd[7343]: Unregistered Authentication Agent for unix-process:59499:210983 (system bus name :1.525, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Oct 10 09:40:31 compute-2 polkitd[7343]: Operator of unix-process:59499:210983 FAILED to authenticate to gain authorization for action org.freedesktop.systemd1.manage-units for system-bus-name::1.524 [<unknown>] (owned by unix-user:zuul)
Oct 10 09:40:31 compute-2 sshd-session[54763]: Connection closed by 192.168.122.30 port 51970
Oct 10 09:40:31 compute-2 sshd-session[54760]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:40:31 compute-2 systemd[1]: session-15.scope: Deactivated successfully.
Oct 10 09:40:31 compute-2 systemd[1]: session-15.scope: Consumed 20.405s CPU time.
Oct 10 09:40:31 compute-2 systemd-logind[796]: Session 15 logged out. Waiting for processes to exit.
Oct 10 09:40:31 compute-2 systemd-logind[796]: Removed session 15.
Oct 10 09:40:43 compute-2 sshd-session[59537]: Accepted publickey for zuul from 192.168.122.30 port 47370 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:40:43 compute-2 systemd-logind[796]: New session 16 of user zuul.
Oct 10 09:40:44 compute-2 systemd[1]: Started Session 16 of User zuul.
Oct 10 09:40:44 compute-2 sshd-session[59537]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:40:45 compute-2 python3.9[59690]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:40:45 compute-2 sudo[59844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikwltteacuphquwflxhxbykuzfhvhzaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089245.5506353-61-189793370982695/AnsiballZ_file.py'
Oct 10 09:40:45 compute-2 sudo[59844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:46 compute-2 python3.9[59846]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:40:46 compute-2 sudo[59844]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:46 compute-2 sudo[60019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdakgsxcctsuxoatryznswqdxyiqguxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089246.4750767-85-68803257461714/AnsiballZ_stat.py'
Oct 10 09:40:46 compute-2 sudo[60019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:47 compute-2 python3.9[60021]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:40:47 compute-2 sudo[60019]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:47 compute-2 sudo[60097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwobcokyiycjhiiixuzpljogeyollbkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089246.4750767-85-68803257461714/AnsiballZ_file.py'
Oct 10 09:40:47 compute-2 sudo[60097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:47 compute-2 python3.9[60099]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.xwfn6yqr recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:40:47 compute-2 sudo[60097]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:48 compute-2 sudo[60249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chutswituhqwboeqqqjrriuevfakxgll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089248.2942297-146-22350998604167/AnsiballZ_stat.py'
Oct 10 09:40:48 compute-2 sudo[60249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:48 compute-2 python3.9[60251]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:40:48 compute-2 sudo[60249]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:48 compute-2 sudo[60327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idfklqzpksytxlifyrqpqevifpdqnhuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089248.2942297-146-22350998604167/AnsiballZ_file.py'
Oct 10 09:40:48 compute-2 sudo[60327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:49 compute-2 python3.9[60329]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.xbt1rhul recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:40:49 compute-2 sudo[60327]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:49 compute-2 sudo[60479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnqtuhfbhdbqajjncuaoylhynippjhls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089249.67956-184-65529321625017/AnsiballZ_file.py'
Oct 10 09:40:49 compute-2 sudo[60479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:50 compute-2 python3.9[60481]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:40:50 compute-2 sudo[60479]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:50 compute-2 sudo[60631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzzbxbmswhzeptrjmhpbamexcagkclew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089250.4866393-208-187173699732396/AnsiballZ_stat.py'
Oct 10 09:40:50 compute-2 sudo[60631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:50 compute-2 python3.9[60633]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:40:51 compute-2 sudo[60631]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:51 compute-2 sudo[60709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abixiyhtlekarskdmccxvyjtlcajxyex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089250.4866393-208-187173699732396/AnsiballZ_file.py'
Oct 10 09:40:51 compute-2 sudo[60709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:51 compute-2 python3.9[60711]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:40:51 compute-2 sudo[60709]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:51 compute-2 sudo[60861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-encopqfespkbimvrpmcuvfcxqshhyiwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089251.56536-208-267697687206376/AnsiballZ_stat.py'
Oct 10 09:40:51 compute-2 sudo[60861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:51 compute-2 python3.9[60863]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:40:52 compute-2 sudo[60861]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:52 compute-2 sudo[60939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nllcjbryokcaphiribhlreoewsguqyhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089251.56536-208-267697687206376/AnsiballZ_file.py'
Oct 10 09:40:52 compute-2 sudo[60939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:52 compute-2 python3.9[60941]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:40:52 compute-2 sudo[60939]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:53 compute-2 sudo[61091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieffiryyozmnvppquncpmyghaldxwqdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089253.065998-278-212659372139669/AnsiballZ_file.py'
Oct 10 09:40:53 compute-2 sudo[61091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:53 compute-2 python3.9[61093]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:40:53 compute-2 sudo[61091]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:54 compute-2 sudo[61243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btiaaufwgwyjbghqlupjhvunvlmspdmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089253.8470967-301-2069784369229/AnsiballZ_stat.py'
Oct 10 09:40:54 compute-2 sudo[61243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:54 compute-2 python3.9[61245]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:40:54 compute-2 sudo[61243]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:54 compute-2 sudo[61321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdhrodrrnhimqwxvcucoaqnzcmjnttvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089253.8470967-301-2069784369229/AnsiballZ_file.py'
Oct 10 09:40:54 compute-2 sudo[61321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:54 compute-2 python3.9[61323]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:40:54 compute-2 sudo[61321]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:55 compute-2 sudo[61474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvhhstnvppkwbydyloveunefnsrvhfwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089255.1333423-337-224740973079954/AnsiballZ_stat.py'
Oct 10 09:40:55 compute-2 sudo[61474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:55 compute-2 python3.9[61476]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:40:55 compute-2 sudo[61474]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:55 compute-2 sudo[61552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfmswkwvzcsdgxwyhxethtgoknqffkxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089255.1333423-337-224740973079954/AnsiballZ_file.py'
Oct 10 09:40:55 compute-2 sudo[61552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:56 compute-2 python3.9[61554]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:40:56 compute-2 sudo[61552]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:57 compute-2 sudo[61704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpcjjcwsiliwglcpbpuvyyavgqxmybid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089256.4110293-373-55439950694487/AnsiballZ_systemd.py'
Oct 10 09:40:57 compute-2 sudo[61704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:57 compute-2 python3.9[61706]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:40:57 compute-2 systemd[1]: Reloading.
Oct 10 09:40:57 compute-2 systemd-rc-local-generator[61732]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:40:57 compute-2 systemd-sysv-generator[61735]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:40:57 compute-2 sudo[61704]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:58 compute-2 sudo[61892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hexgsvhaclbspwvksypgkbnsrmlkecsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089258.0112855-397-74510789613366/AnsiballZ_stat.py'
Oct 10 09:40:58 compute-2 sudo[61892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:58 compute-2 python3.9[61894]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:40:58 compute-2 sudo[61892]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:58 compute-2 sudo[61970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfhdywykoyozpuhbumxahliujohhuadw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089258.0112855-397-74510789613366/AnsiballZ_file.py'
Oct 10 09:40:58 compute-2 sudo[61970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:59 compute-2 python3.9[61972]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:40:59 compute-2 sudo[61970]: pam_unix(sudo:session): session closed for user root
Oct 10 09:40:59 compute-2 sudo[62122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvyjcahcnwxjdrkavpkntfshnvvvplij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089259.4007545-434-87499827996177/AnsiballZ_stat.py'
Oct 10 09:40:59 compute-2 sudo[62122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:40:59 compute-2 python3.9[62124]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:40:59 compute-2 sudo[62122]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:00 compute-2 sudo[62200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgzzvqdoxqxnwxhddmregybrxuwrvxhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089259.4007545-434-87499827996177/AnsiballZ_file.py'
Oct 10 09:41:00 compute-2 sudo[62200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:00 compute-2 python3.9[62202]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:00 compute-2 sudo[62200]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:01 compute-2 sudo[62352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adcydwuokbrbmvgbdpivwwncewehxgac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089260.8066163-470-29110061708075/AnsiballZ_systemd.py'
Oct 10 09:41:01 compute-2 sudo[62352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:01 compute-2 python3.9[62354]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:41:01 compute-2 systemd[1]: Reloading.
Oct 10 09:41:01 compute-2 systemd-rc-local-generator[62385]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:41:01 compute-2 systemd-sysv-generator[62388]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:41:01 compute-2 systemd[1]: Starting Create netns directory...
Oct 10 09:41:01 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 09:41:01 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 09:41:01 compute-2 systemd[1]: Finished Create netns directory.
Oct 10 09:41:01 compute-2 sudo[62352]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:02 compute-2 python3.9[62545]: ansible-ansible.builtin.service_facts Invoked
Oct 10 09:41:02 compute-2 network[62562]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 09:41:02 compute-2 network[62563]: 'network-scripts' will be removed from distribution in near future.
Oct 10 09:41:02 compute-2 network[62564]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 09:41:07 compute-2 sudo[62825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvcxlvsvgtmopmwpxegoxbwkovkqretp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089267.2422676-548-186079940175701/AnsiballZ_stat.py'
Oct 10 09:41:07 compute-2 sudo[62825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:07 compute-2 python3.9[62827]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:41:07 compute-2 sudo[62825]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:08 compute-2 sudo[62903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxfsnetbwidvmginankinaighgpylzrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089267.2422676-548-186079940175701/AnsiballZ_file.py'
Oct 10 09:41:08 compute-2 sudo[62903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:08 compute-2 python3.9[62905]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:08 compute-2 sudo[62903]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:08 compute-2 sudo[63055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neezuhnybnyzmmhydtnqksgkzgzpkomh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089268.7358415-587-273152600571676/AnsiballZ_file.py'
Oct 10 09:41:08 compute-2 sudo[63055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:09 compute-2 python3.9[63057]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:09 compute-2 sudo[63055]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:09 compute-2 sudo[63207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvxoktnicqhqygfkjeanfczwuatqqlfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089269.5043862-611-154740142628858/AnsiballZ_stat.py'
Oct 10 09:41:09 compute-2 sudo[63207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:09 compute-2 python3.9[63209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:41:09 compute-2 sudo[63207]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:10 compute-2 sudo[63330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alcjgpmhwdhzmjlzrpxmqqigvrvjvvmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089269.5043862-611-154740142628858/AnsiballZ_copy.py'
Oct 10 09:41:10 compute-2 sudo[63330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:10 compute-2 python3.9[63332]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089269.5043862-611-154740142628858/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:10 compute-2 sudo[63330]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:11 compute-2 sudo[63482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrferknenpvjyrqgmzavisachlbawxcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089271.1750855-665-37593015184277/AnsiballZ_timezone.py'
Oct 10 09:41:11 compute-2 sudo[63482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:11 compute-2 python3.9[63484]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 10 09:41:11 compute-2 systemd[1]: Starting Time & Date Service...
Oct 10 09:41:11 compute-2 systemd[1]: Started Time & Date Service.
Oct 10 09:41:11 compute-2 sudo[63482]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:12 compute-2 sudo[63638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ketyvjmkamwwacsjjtcktcfvuljahbcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089272.6107316-692-186169511067053/AnsiballZ_file.py'
Oct 10 09:41:12 compute-2 sudo[63638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:13 compute-2 python3.9[63640]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:13 compute-2 sudo[63638]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:13 compute-2 sudo[63790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyikqsbvdzemgdoedrdvlclerlbdsanb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089273.3465185-715-137411755594630/AnsiballZ_stat.py'
Oct 10 09:41:13 compute-2 sudo[63790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:13 compute-2 python3.9[63792]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:41:13 compute-2 sudo[63790]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:14 compute-2 sudo[63913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utcnqmldkkfihjkusaqpdsjqvumszmrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089273.3465185-715-137411755594630/AnsiballZ_copy.py'
Oct 10 09:41:14 compute-2 sudo[63913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:14 compute-2 python3.9[63915]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089273.3465185-715-137411755594630/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:14 compute-2 sudo[63913]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:15 compute-2 sudo[64065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjujpvsvthophzgseobsrgbbycqynqqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089274.887268-760-28169979534399/AnsiballZ_stat.py'
Oct 10 09:41:15 compute-2 sudo[64065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:15 compute-2 python3.9[64067]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:41:15 compute-2 sudo[64065]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:15 compute-2 sudo[64188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylosgjfsthkghaqgzmvghqeeohaeaxtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089274.887268-760-28169979534399/AnsiballZ_copy.py'
Oct 10 09:41:15 compute-2 sudo[64188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:15 compute-2 python3.9[64190]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089274.887268-760-28169979534399/.source.yaml _original_basename=.fbb2i29c follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:16 compute-2 sudo[64188]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:16 compute-2 sudo[64341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqpejnmrwziyirgpppsrgmzxinnldfuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089276.3215218-806-162386162924705/AnsiballZ_stat.py'
Oct 10 09:41:16 compute-2 sudo[64341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:16 compute-2 python3.9[64343]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:41:16 compute-2 sudo[64341]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:17 compute-2 sudo[64464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnuguchlcjddwklvgcwqeurccammnqqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089276.3215218-806-162386162924705/AnsiballZ_copy.py'
Oct 10 09:41:17 compute-2 sudo[64464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:17 compute-2 python3.9[64466]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089276.3215218-806-162386162924705/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:17 compute-2 sudo[64464]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:18 compute-2 sudo[64616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnbmhxzsyvsdeaupoxlxgysidcgmozgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089277.861357-851-157736650505764/AnsiballZ_command.py'
Oct 10 09:41:18 compute-2 sudo[64616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:18 compute-2 python3.9[64618]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:41:18 compute-2 sudo[64616]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:19 compute-2 sudo[64769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxjzmmwmadzobskntwvnoqjkmxaydhpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089278.8310413-875-66878208263509/AnsiballZ_command.py'
Oct 10 09:41:19 compute-2 sudo[64769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:19 compute-2 python3.9[64771]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:41:19 compute-2 sudo[64769]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:20 compute-2 sudo[64922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bovdqvyijeqxidnqhjxycvzhvfnxesfu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760089279.7342978-898-261954283131385/AnsiballZ_edpm_nftables_from_files.py'
Oct 10 09:41:20 compute-2 sudo[64922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:20 compute-2 python3[64924]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 10 09:41:20 compute-2 sudo[64922]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:21 compute-2 sudo[65074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brluqbkiiaknnfhrglsgwgaapfsosvsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089280.8458064-923-97021584640293/AnsiballZ_stat.py'
Oct 10 09:41:21 compute-2 sudo[65074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:21 compute-2 python3.9[65076]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:41:21 compute-2 sudo[65074]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:21 compute-2 sudo[65197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dizeieykhiqffaifsgietmxutoavdkre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089280.8458064-923-97021584640293/AnsiballZ_copy.py'
Oct 10 09:41:21 compute-2 sudo[65197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:21 compute-2 python3.9[65199]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089280.8458064-923-97021584640293/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:21 compute-2 sudo[65197]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:22 compute-2 sudo[65349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myautsiuolmozufleknplfyzsqbficdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089282.419474-968-55220374828843/AnsiballZ_stat.py'
Oct 10 09:41:22 compute-2 sudo[65349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:22 compute-2 python3.9[65351]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:41:22 compute-2 sudo[65349]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:23 compute-2 sudo[65472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akuyklhbmhmysvoykjhcygugwxqwhvqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089282.419474-968-55220374828843/AnsiballZ_copy.py'
Oct 10 09:41:23 compute-2 sudo[65472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:23 compute-2 python3.9[65474]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089282.419474-968-55220374828843/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:23 compute-2 sudo[65472]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:24 compute-2 sudo[65624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvdflgxbtwktlwjxvuvreewrvcckberk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089283.9498124-1013-3757482558625/AnsiballZ_stat.py'
Oct 10 09:41:24 compute-2 sudo[65624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:24 compute-2 python3.9[65626]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:41:24 compute-2 sudo[65624]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:24 compute-2 sudo[65747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svvfpklzthfreidfftdxwooabmjdauii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089283.9498124-1013-3757482558625/AnsiballZ_copy.py'
Oct 10 09:41:24 compute-2 sudo[65747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:25 compute-2 python3.9[65749]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089283.9498124-1013-3757482558625/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:25 compute-2 sudo[65747]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:25 compute-2 sudo[65899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lredskfsgtbbieeaeemoejzbuqvgdhcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089285.580667-1058-273865495106653/AnsiballZ_stat.py'
Oct 10 09:41:25 compute-2 sudo[65899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:26 compute-2 python3.9[65901]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:41:26 compute-2 sudo[65899]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:26 compute-2 sudo[66022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbupdvuvvnvwncmrwrdgmshsqccgfjfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089285.580667-1058-273865495106653/AnsiballZ_copy.py'
Oct 10 09:41:26 compute-2 sudo[66022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:26 compute-2 chronyd[54279]: Selected source 142.4.192.253 (pool.ntp.org)
Oct 10 09:41:26 compute-2 python3.9[66024]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089285.580667-1058-273865495106653/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:26 compute-2 sudo[66022]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:27 compute-2 sudo[66174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwgxrvtnnllsmsgmjglrxzuwxopycomu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089287.0687494-1103-98774151432424/AnsiballZ_stat.py'
Oct 10 09:41:27 compute-2 sudo[66174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:27 compute-2 python3.9[66176]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:41:27 compute-2 sudo[66174]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:28 compute-2 sudo[66297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nksuamvwdngzxqydsccnytddfwzsteqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089287.0687494-1103-98774151432424/AnsiballZ_copy.py'
Oct 10 09:41:28 compute-2 sudo[66297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:28 compute-2 python3.9[66299]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089287.0687494-1103-98774151432424/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:28 compute-2 sudo[66297]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:29 compute-2 sudo[66449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjgppjwenskgjngpyoxhiocqnumduypt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089288.7489011-1148-101244750723146/AnsiballZ_file.py'
Oct 10 09:41:29 compute-2 sudo[66449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:29 compute-2 python3.9[66451]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:29 compute-2 sudo[66449]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:29 compute-2 sudo[66601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glarisaebmkdzrneyumzywazxvtikzof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089289.554153-1172-91708718639894/AnsiballZ_command.py'
Oct 10 09:41:29 compute-2 sudo[66601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:30 compute-2 python3.9[66603]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:41:30 compute-2 sudo[66601]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:30 compute-2 sudo[66760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weehbyxycrepnssorclzwlfqjtlvymqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089290.5122776-1196-199720532588428/AnsiballZ_blockinfile.py'
Oct 10 09:41:30 compute-2 sudo[66760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:31 compute-2 python3.9[66762]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:31 compute-2 sudo[66760]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:31 compute-2 sudo[66913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cekdraatutzdarfmgqxvhzoobhxgxnry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089291.6490333-1223-186100453628353/AnsiballZ_file.py'
Oct 10 09:41:31 compute-2 sudo[66913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:32 compute-2 python3.9[66915]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:32 compute-2 sudo[66913]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:32 compute-2 sudo[67065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzystnwuqxmpgrkyvacwtxqbtpsqovuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089292.324372-1223-189789437581501/AnsiballZ_file.py'
Oct 10 09:41:32 compute-2 sudo[67065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:32 compute-2 python3.9[67067]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:32 compute-2 sudo[67065]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:33 compute-2 sudo[67217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izztijbakazqzmzkuhwpyydhbjeuhdjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089293.2514167-1267-244594144328036/AnsiballZ_mount.py'
Oct 10 09:41:33 compute-2 sudo[67217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:33 compute-2 python3.9[67219]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 10 09:41:33 compute-2 sudo[67217]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:34 compute-2 sudo[67370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzmjnspesrfqvunsyulsmjiwimkftxra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089294.0623972-1267-56109802206601/AnsiballZ_mount.py'
Oct 10 09:41:34 compute-2 sudo[67370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:34 compute-2 python3.9[67372]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 10 09:41:34 compute-2 sudo[67370]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:35 compute-2 sshd-session[59540]: Connection closed by 192.168.122.30 port 47370
Oct 10 09:41:35 compute-2 sshd-session[59537]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:41:35 compute-2 systemd-logind[796]: Session 16 logged out. Waiting for processes to exit.
Oct 10 09:41:35 compute-2 systemd[1]: session-16.scope: Deactivated successfully.
Oct 10 09:41:35 compute-2 systemd[1]: session-16.scope: Consumed 31.794s CPU time.
Oct 10 09:41:35 compute-2 systemd-logind[796]: Removed session 16.
Oct 10 09:41:41 compute-2 sshd-session[67398]: Accepted publickey for zuul from 192.168.122.30 port 49244 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:41:41 compute-2 systemd-logind[796]: New session 17 of user zuul.
Oct 10 09:41:41 compute-2 systemd[1]: Started Session 17 of User zuul.
Oct 10 09:41:41 compute-2 sshd-session[67398]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:41:41 compute-2 sudo[67551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnkpokqyahdkekeykgmhwehaokigmkqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089301.3833635-20-186521875603281/AnsiballZ_tempfile.py'
Oct 10 09:41:41 compute-2 sudo[67551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:41 compute-2 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 10 09:41:42 compute-2 python3.9[67553]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct 10 09:41:42 compute-2 sudo[67551]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:42 compute-2 sudo[67705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzqoffkhkqlydflugbqbljcnxerpdhyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089302.423839-56-215554656478964/AnsiballZ_stat.py'
Oct 10 09:41:42 compute-2 sudo[67705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:43 compute-2 python3.9[67707]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:41:43 compute-2 sudo[67705]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:43 compute-2 sudo[67857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgwtnbitbtghlvnenknxfazlbpkukpeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089303.3956127-86-226782665191482/AnsiballZ_setup.py'
Oct 10 09:41:43 compute-2 sudo[67857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:44 compute-2 python3.9[67859]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:41:44 compute-2 sudo[67857]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:45 compute-2 sudo[68009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndzdmwvdsimswzviuidsorgmuvosmghm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089304.6557562-111-244661351684124/AnsiballZ_blockinfile.py'
Oct 10 09:41:45 compute-2 sudo[68009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:45 compute-2 python3.9[68011]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCs576V3VvbSgv48Ml4JM3ripPY5VUVh8vdkDr1njjfd7J/WrQQkTf/D0b7+eGTXj3Y1fx1/haVrDafo7g0NqcSZX+zNUgTCnYPWafo7RMG4Q7ITVk1NPIkAC1cDUxHNeWhXaOkxCz96sTkO4aNW3uoFjsp2JkJtRJmHzT7q/bc0N9x7YcWh9vwRRBiOKlV8cWMHuHUzOlloEQLN67Dht1xHWr1eO/SITqUlWY13tc/54xQuo8nBQNNX9ArhMbJz2a9AoNVUAAYFF8hWFI5ES/GL9qsCp8dnmAtrY4Rc07QmHo1RkcjXe1f6D+vymRIP3YOqIjlWp0blCTfcCGno5lBa9f5JachIsogk+5+GYx4AAbWLyxxecfKzdCxrGnQlfFgldc1xDN1RG+8HwFEAuHQDWTCDUgF67FXSHy7aVxrdzU4046193/o3VKTpSaJmFldASxFgyUeujs56OgC0qYM0zKV4jOsMBcocVHvH/1FOPWIr81XXYvu6C/Ntd6sBj0=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGSf7pFS/S1SmUMk/yMobwR+LTaQZlAhBqo7Ido5r8dg
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBB1l0EOuMseZ7ulHkfzzVtKv+5A9EWRy+oXVB+t370vohhJoN3+lviS8xoR8GttJUcHVCaeioniRtOWysbNdC0I=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDarlOcgDXqRdSww3oIuqu7nGIBJToNGSnU1ljOr6GTlHTxxOoTztIrvZrPaJA8w/ixztkhFZZSdRPw4meYayY05CNu9SneiL62twzDLDsqeDPAspkh69Ljj5aGCLf6GJDiK0m2h1jLDIFtXH3lIQE9781zA7ZQ8+/xeF4yRS1/Fb5CXDG+oi/J0veCffs6t0TYmrUfSgS2H2y0UxNu7C6GoQKRde1arPLOYexvlg2RjlWM6Ex4JCqTAd9EN330Kh4HUr3r46ET8mwi1mPndibbiW0heXgrg8FeV5hBqOxQsGgLEKpX1cNAz6Rr0C5Hg1xfGcsJtep88vbJFmMyV1jNowDtJCYpprqa16Nj35HBuuz7zbzVlIdeQhEJ9I4I7eNhUxlb2/XYRXy2hfsrM9D2TP7B+bVPLjlqgqy8stBhGBCtH32ppNsXHE6uGPHMovcz2VhbP/P3sp9NQV+hF2Q0RbBXrQZkEI9YJdhxQw5hyOqwfPrEEBFy8FpzSKfBAW0=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC1nQuW/lbxVJxo9H20J7i0+Z6cHtufrF4VbA6zs724f
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBB0oTxSrAqx34tAubl7rouYPI7qhs6NhoDmGr3PTW1+mypEQw0EO+pZ99zSRnweC5RBoL080AgUKo7KN+v3LDHw=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUnwO+j5aInA4FKMx5pWF8B0Zp6L17GsYV5RBbu6iT67LtXjwbz5nP4EC7t80boMHnS7DRNCAxF0FNMVhQ9o4+1E1n2mrUxxAw8YxcZTabu/lAqRb4I6RzmXdXSA9mF8O3onswi/KhJg6YUTFEWCuxWrMLco15IatKi+hNqcRUk1DreR2L/YN0W5qXkvj1z3aoph1h3Yn1lRjuQDrVHp6lCywixC2pHwYG+CrPyX+0PkXJg+JRvRdxNCIw0D0zOkJrnppmT8XpIj42JLRUGGV592XFVXHiEhZdOI2bdzPy490EfIbWF9Symqi/V5vf8SK9LMOscHXkD7jsT6VKzsUXyk6/IzzZ2TzhD173lt8HpRJyaZq4ME0ZSVYNyD58DN/CQ3xpO1c1E8Wp4fUswc4WHmb/eILnY0lDXOZt6Hb/e+K6RHu5e5GOo0KSfei/LyrqJkBQn2P8UkbJvrUh2bNw+whjvT5CmXd3rPCw+Xq3/K3Gpit1K/4pC0zGC+CQr7E=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILklS4uW4IrGY5dWZTg4VeKVeFB3jPeUpu/8f4D1+rd5
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCelD2lLiMWT09YjxTI9IfdSnHfdMuHKAAEYFKZmJg34mgwUIDqUQqoc9I6a7Ps9pRizY+UpHWL//lD7hvvhD5k=
                                             create=True mode=0644 path=/tmp/ansible.3fb8vn2c state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:45 compute-2 sudo[68009]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:45 compute-2 sudo[68161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjctkwtkhjdlijfpvmpabczohqcovypf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089305.5914574-136-86736616597070/AnsiballZ_command.py'
Oct 10 09:41:46 compute-2 sudo[68161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:46 compute-2 python3.9[68163]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.3fb8vn2c' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:41:46 compute-2 sudo[68161]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:46 compute-2 sudo[68315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkvaelkxbftkttxjngaekqtjyguccaon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089306.516412-159-201985548506211/AnsiballZ_file.py'
Oct 10 09:41:46 compute-2 sudo[68315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:47 compute-2 python3.9[68317]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.3fb8vn2c state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:41:47 compute-2 sudo[68315]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:47 compute-2 sshd-session[67401]: Connection closed by 192.168.122.30 port 49244
Oct 10 09:41:47 compute-2 sshd-session[67398]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:41:47 compute-2 systemd[1]: session-17.scope: Deactivated successfully.
Oct 10 09:41:47 compute-2 systemd[1]: session-17.scope: Consumed 3.529s CPU time.
Oct 10 09:41:47 compute-2 systemd-logind[796]: Session 17 logged out. Waiting for processes to exit.
Oct 10 09:41:47 compute-2 systemd-logind[796]: Removed session 17.
Oct 10 09:41:53 compute-2 sshd-session[68342]: Accepted publickey for zuul from 192.168.122.30 port 57700 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:41:53 compute-2 systemd-logind[796]: New session 18 of user zuul.
Oct 10 09:41:53 compute-2 systemd[1]: Started Session 18 of User zuul.
Oct 10 09:41:53 compute-2 sshd-session[68342]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:41:54 compute-2 python3.9[68495]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:41:55 compute-2 sudo[68649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqqktkfjrgoschjeuuqhfpedvljsvopz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089314.6685424-58-183998939734609/AnsiballZ_systemd.py'
Oct 10 09:41:55 compute-2 sudo[68649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:55 compute-2 python3.9[68651]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 10 09:41:55 compute-2 sudo[68649]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:56 compute-2 sudo[68803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcljmzvhvolqoharkbezyucnnnrnvcjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089315.9606853-82-150725787317737/AnsiballZ_systemd.py'
Oct 10 09:41:56 compute-2 sudo[68803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:56 compute-2 python3.9[68805]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 09:41:56 compute-2 sudo[68803]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:57 compute-2 sudo[68956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trwkvrsrnlviynvrnyolvxqanymbpjuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089316.9200675-109-75486804839277/AnsiballZ_command.py'
Oct 10 09:41:57 compute-2 sudo[68956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:57 compute-2 python3.9[68958]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:41:57 compute-2 sudo[68956]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:58 compute-2 sudo[69109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfqsuhcqkxcsatcxdrpbgghdzlfophnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089317.86974-133-251557654489928/AnsiballZ_stat.py'
Oct 10 09:41:58 compute-2 sudo[69109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:58 compute-2 python3.9[69111]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:41:58 compute-2 sudo[69109]: pam_unix(sudo:session): session closed for user root
Oct 10 09:41:59 compute-2 sudo[69263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olnabtjmympocbqxcxtnhudahticwvbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089319.0172057-158-52134235081677/AnsiballZ_command.py'
Oct 10 09:41:59 compute-2 sudo[69263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:41:59 compute-2 python3.9[69265]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:41:59 compute-2 sudo[69263]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:00 compute-2 sudo[69418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohspzfufataehsqdiznurtenqnabmxpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089319.851268-181-72588747079202/AnsiballZ_file.py'
Oct 10 09:42:00 compute-2 sudo[69418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:00 compute-2 python3.9[69420]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:42:00 compute-2 sudo[69418]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:01 compute-2 sshd-session[68345]: Connection closed by 192.168.122.30 port 57700
Oct 10 09:42:01 compute-2 sshd-session[68342]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:42:01 compute-2 systemd[1]: session-18.scope: Deactivated successfully.
Oct 10 09:42:01 compute-2 systemd[1]: session-18.scope: Consumed 4.313s CPU time.
Oct 10 09:42:01 compute-2 systemd-logind[796]: Session 18 logged out. Waiting for processes to exit.
Oct 10 09:42:01 compute-2 systemd-logind[796]: Removed session 18.
Oct 10 09:42:06 compute-2 sshd-session[69446]: Accepted publickey for zuul from 192.168.122.30 port 59920 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:42:06 compute-2 systemd-logind[796]: New session 19 of user zuul.
Oct 10 09:42:06 compute-2 systemd[1]: Started Session 19 of User zuul.
Oct 10 09:42:06 compute-2 sshd-session[69446]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:42:07 compute-2 python3.9[69599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:42:08 compute-2 sudo[69753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvodmvubstcmxlihzgzhqjgrtjqgxpfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089328.2958398-64-176764061485977/AnsiballZ_setup.py'
Oct 10 09:42:08 compute-2 sudo[69753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:08 compute-2 python3.9[69755]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:42:09 compute-2 sudo[69753]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:09 compute-2 sudo[69837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxegoxwnihkmlsqflhdyhsczprqwpmrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089328.2958398-64-176764061485977/AnsiballZ_dnf.py'
Oct 10 09:42:09 compute-2 sudo[69837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:09 compute-2 python3.9[69839]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 10 09:42:11 compute-2 sudo[69837]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:11 compute-2 python3.9[69990]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:42:13 compute-2 python3.9[70141]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 09:42:14 compute-2 python3.9[70291]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:42:14 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 09:42:14 compute-2 python3.9[70442]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:42:15 compute-2 sshd-session[69449]: Connection closed by 192.168.122.30 port 59920
Oct 10 09:42:15 compute-2 sshd-session[69446]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:42:15 compute-2 systemd[1]: session-19.scope: Deactivated successfully.
Oct 10 09:42:15 compute-2 systemd[1]: session-19.scope: Consumed 5.810s CPU time.
Oct 10 09:42:15 compute-2 systemd-logind[796]: Session 19 logged out. Waiting for processes to exit.
Oct 10 09:42:15 compute-2 systemd-logind[796]: Removed session 19.
Oct 10 09:42:23 compute-2 sshd-session[70467]: Accepted publickey for zuul from 38.102.83.82 port 56466 ssh2: RSA SHA256:RwPGCkYG1Mlcunwa9tTlXvLSrYLunSGhwxtMMuIfos4
Oct 10 09:42:23 compute-2 systemd-logind[796]: New session 20 of user zuul.
Oct 10 09:42:23 compute-2 systemd[1]: Started Session 20 of User zuul.
Oct 10 09:42:23 compute-2 sshd-session[70467]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:42:23 compute-2 sudo[70543]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxppkegmmtpykpigzjclceeccjhquues ; /usr/bin/python3'
Oct 10 09:42:23 compute-2 sudo[70543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:23 compute-2 useradd[70547]: new group: name=ceph-admin, GID=42478
Oct 10 09:42:23 compute-2 useradd[70547]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Oct 10 09:42:23 compute-2 sudo[70543]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:24 compute-2 sudo[70629]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njoqtjdfmvnntnjdiuljdaaemmfdyaio ; /usr/bin/python3'
Oct 10 09:42:24 compute-2 sudo[70629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:24 compute-2 sudo[70629]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:24 compute-2 sudo[70702]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywpxkcoxuebddqigqjqnefhknbvfmjmp ; /usr/bin/python3'
Oct 10 09:42:24 compute-2 sudo[70702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:24 compute-2 sudo[70702]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:25 compute-2 sudo[70752]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chwhmgyfrjohnyvoglvxuirzumplwcpt ; /usr/bin/python3'
Oct 10 09:42:25 compute-2 sudo[70752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:25 compute-2 sudo[70752]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:25 compute-2 sudo[70778]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnvnwjkdsovjvqcldreqzyunewuceauo ; /usr/bin/python3'
Oct 10 09:42:25 compute-2 sudo[70778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:25 compute-2 sudo[70778]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:25 compute-2 sudo[70804]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flqpkwbwgzsayruyohmghweutfdbzawm ; /usr/bin/python3'
Oct 10 09:42:26 compute-2 sudo[70804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:26 compute-2 sudo[70804]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:26 compute-2 sudo[70830]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thfftnezxeaigjmcevvfaefrbypunhxa ; /usr/bin/python3'
Oct 10 09:42:26 compute-2 sudo[70830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:26 compute-2 sudo[70830]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:27 compute-2 sudo[70908]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbuanynoijrcfqtevbomgvgjmmjgdyzp ; /usr/bin/python3'
Oct 10 09:42:27 compute-2 sudo[70908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:27 compute-2 sudo[70908]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:27 compute-2 sudo[70981]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sabpxqzrxcuysgcxyamzgvrliwinlpsa ; /usr/bin/python3'
Oct 10 09:42:27 compute-2 sudo[70981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:27 compute-2 sudo[70981]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:28 compute-2 sudo[71083]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fskhyzatoaezmxtdbkveuonalzovqjnn ; /usr/bin/python3'
Oct 10 09:42:28 compute-2 sudo[71083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:28 compute-2 sudo[71083]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:28 compute-2 sudo[71156]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgnqweulwgunzcpydbvxvonnqxmcolzr ; /usr/bin/python3'
Oct 10 09:42:28 compute-2 sudo[71156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:28 compute-2 sudo[71156]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:29 compute-2 sudo[71206]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqebavrpyrluaxknngftkcgeetkxsbbh ; /usr/bin/python3'
Oct 10 09:42:29 compute-2 sudo[71206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:29 compute-2 python3[71208]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:42:30 compute-2 sudo[71206]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:31 compute-2 sudo[71301]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntfypeblsnruuizaptqvghsjolmoulpb ; /usr/bin/python3'
Oct 10 09:42:31 compute-2 sudo[71301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:31 compute-2 python3[71303]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 10 09:42:32 compute-2 sudo[71301]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:33 compute-2 sudo[71328]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cftrdishlmffxfuitcwvwutzkuajiyzh ; /usr/bin/python3'
Oct 10 09:42:33 compute-2 sudo[71328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:33 compute-2 python3[71330]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:42:33 compute-2 sudo[71328]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:33 compute-2 sudo[71354]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxmupjdnnpwhsppomnwcduhajibdlhqz ; /usr/bin/python3'
Oct 10 09:42:33 compute-2 sudo[71354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:33 compute-2 python3[71356]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:42:33 compute-2 kernel: loop: module loaded
Oct 10 09:42:33 compute-2 kernel: loop3: detected capacity change from 0 to 41943040
Oct 10 09:42:33 compute-2 sudo[71354]: pam_unix(sudo:session): session closed for user root
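[editor's note] The Ansible shell task logged above (its three commands split across journal lines) prepares a sparse file-backed loop device for the OSD; the same steps can be run by hand. This is a sketch using the exact paths from the log — /dev/loop3 just happened to be the first free loop device on this host, so `losetup -f` is the safer choice elsewhere:

```shell
# Create a 20 GiB sparse file: count=0 writes no data, seek=20G sets the size
dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
# Attach it to a loop device (the log shows /dev/loop3; `losetup -f --show FILE`
# would pick the first free device instead of hard-coding one)
losetup /dev/loop3 /var/lib/ceph-osd-0.img
# Confirm the new 20G block device is visible
lsblk
```

The `detected capacity change from 0 to 41943040` kernel message above is this attach: 41943040 512-byte sectors is exactly 20 GiB.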
Oct 10 09:42:33 compute-2 sudo[71389]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfzgdbaotgvspwfrzopzbxkbybqrlnpp ; /usr/bin/python3'
Oct 10 09:42:33 compute-2 sudo[71389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:34 compute-2 python3[71391]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:42:34 compute-2 lvm[71394]: PV /dev/loop3 not used.
Oct 10 09:42:34 compute-2 lvm[71396]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 09:42:34 compute-2 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Oct 10 09:42:34 compute-2 lvm[71399]:   1 logical volume(s) in volume group "ceph_vg0" now active
Oct 10 09:42:34 compute-2 lvm[71406]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 09:42:34 compute-2 lvm[71406]: VG ceph_vg0 finished
Oct 10 09:42:34 compute-2 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Oct 10 09:42:34 compute-2 sudo[71389]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:34 compute-2 sudo[71482]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txukyfeiklzrthfcdwpxhmyewkxdbwkt ; /usr/bin/python3'
Oct 10 09:42:34 compute-2 sudo[71482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:34 compute-2 python3[71484]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 09:42:34 compute-2 sudo[71482]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:35 compute-2 sudo[71555]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xikkvidideypkhzoherzfbzzksjcycxg ; /usr/bin/python3'
Oct 10 09:42:35 compute-2 sudo[71555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:35 compute-2 python3[71557]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760089354.595812-33485-109348056988994/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:42:35 compute-2 sudo[71555]: pam_unix(sudo:session): session closed for user root
Oct 10 09:42:35 compute-2 sudo[71605]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgnpxlvwabmbowlajywcmaqcwvnstblf ; /usr/bin/python3'
Oct 10 09:42:35 compute-2 sudo[71605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:42:36 compute-2 python3[71607]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:42:36 compute-2 systemd[1]: Reloading.
Oct 10 09:42:36 compute-2 systemd-rc-local-generator[71636]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:42:36 compute-2 systemd-sysv-generator[71643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:42:36 compute-2 systemd[1]: Starting Ceph OSD losetup...
Oct 10 09:42:36 compute-2 bash[71648]: /dev/loop3: [64513]:4555204 (/var/lib/ceph-osd-0.img)
Oct 10 09:42:36 compute-2 systemd[1]: Finished Ceph OSD losetup.
Oct 10 09:42:36 compute-2 lvm[71649]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 09:42:36 compute-2 lvm[71649]: VG ceph_vg0 finished
Oct 10 09:42:36 compute-2 sudo[71605]: pam_unix(sudo:session): session closed for user root
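[editor's note] The unit file copied to /etc/systemd/system/ceph-osd-losetup-0.service is not reproduced in the log. A minimal oneshot unit consistent with the "Starting/Finished Ceph OSD losetup" messages, and with the bash process printing the `losetup` status line for /dev/loop3, might look like the following — a reconstruction for illustration, not the actual ceph-osd-losetup.service.j2 template:

```ini
[Unit]
Description=Ceph OSD losetup
After=local-fs.target

[Service]
Type=oneshot
RemainAfterExit=true
# Print the current mapping if the loop device already exists,
# otherwise re-attach the backing file (e.g. after a reboot)
ExecStart=/bin/bash -c '/sbin/losetup /dev/loop3 || /sbin/losetup /dev/loop3 /var/lib/ceph-osd-0.img'

[Install]
WantedBy=multi-user.target
```

`Type=oneshot` with `RemainAfterExit=true` matches the observed behavior: the unit runs once, systemd logs "Finished", and the loop device persists for the LVM autoactivation that follows.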
Oct 10 09:42:38 compute-2 python3[71673]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:42:56 compute-2 PackageKit[30964]: daemon quit
Oct 10 09:42:56 compute-2 systemd[1]: packagekit.service: Deactivated successfully.
Oct 10 09:43:24 compute-2 sshd-session[71718]: Invalid user admin from 2.57.121.112 port 31505
Oct 10 09:43:25 compute-2 sshd-session[71718]: Received disconnect from 2.57.121.112 port 31505:11: Bye [preauth]
Oct 10 09:43:25 compute-2 sshd-session[71718]: Disconnected from invalid user admin 2.57.121.112 port 31505 [preauth]
Oct 10 09:44:10 compute-2 sshd-session[71720]: Accepted publickey for ceph-admin from 192.168.122.100 port 41892 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:10 compute-2 systemd[1]: Created slice User Slice of UID 42477.
Oct 10 09:44:10 compute-2 systemd[1]: Starting User Runtime Directory /run/user/42477...
Oct 10 09:44:10 compute-2 systemd-logind[796]: New session 21 of user ceph-admin.
Oct 10 09:44:10 compute-2 systemd[1]: Finished User Runtime Directory /run/user/42477.
Oct 10 09:44:10 compute-2 systemd[1]: Starting User Manager for UID 42477...
Oct 10 09:44:10 compute-2 systemd[71724]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:11 compute-2 systemd[71724]: Queued start job for default target Main User Target.
Oct 10 09:44:11 compute-2 systemd[71724]: Created slice User Application Slice.
Oct 10 09:44:11 compute-2 systemd[71724]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 10 09:44:11 compute-2 systemd[71724]: Started Daily Cleanup of User's Temporary Directories.
Oct 10 09:44:11 compute-2 systemd[71724]: Reached target Paths.
Oct 10 09:44:11 compute-2 systemd[71724]: Reached target Timers.
Oct 10 09:44:11 compute-2 systemd[71724]: Starting D-Bus User Message Bus Socket...
Oct 10 09:44:11 compute-2 systemd[71724]: Starting Create User's Volatile Files and Directories...
Oct 10 09:44:11 compute-2 sshd-session[71739]: Accepted publickey for ceph-admin from 192.168.122.100 port 41906 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:11 compute-2 systemd[71724]: Finished Create User's Volatile Files and Directories.
Oct 10 09:44:11 compute-2 systemd[71724]: Listening on D-Bus User Message Bus Socket.
Oct 10 09:44:11 compute-2 systemd[71724]: Reached target Sockets.
Oct 10 09:44:11 compute-2 systemd[71724]: Reached target Basic System.
Oct 10 09:44:11 compute-2 systemd[71724]: Reached target Main User Target.
Oct 10 09:44:11 compute-2 systemd[71724]: Startup finished in 137ms.
Oct 10 09:44:11 compute-2 systemd[1]: Started User Manager for UID 42477.
Oct 10 09:44:11 compute-2 systemd[1]: Started Session 21 of User ceph-admin.
Oct 10 09:44:11 compute-2 sshd-session[71720]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:11 compute-2 systemd-logind[796]: New session 23 of user ceph-admin.
Oct 10 09:44:11 compute-2 systemd[1]: Started Session 23 of User ceph-admin.
Oct 10 09:44:11 compute-2 sshd-session[71739]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:11 compute-2 sudo[71747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:44:11 compute-2 sudo[71747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:11 compute-2 sudo[71747]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:11 compute-2 sshd-session[71772]: Accepted publickey for ceph-admin from 192.168.122.100 port 41914 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:11 compute-2 systemd-logind[796]: New session 24 of user ceph-admin.
Oct 10 09:44:11 compute-2 systemd[1]: Started Session 24 of User ceph-admin.
Oct 10 09:44:11 compute-2 sshd-session[71772]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:11 compute-2 sudo[71776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-2
Oct 10 09:44:11 compute-2 sudo[71776]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:11 compute-2 sudo[71776]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:11 compute-2 sshd-session[71801]: Accepted publickey for ceph-admin from 192.168.122.100 port 41926 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:11 compute-2 systemd-logind[796]: New session 25 of user ceph-admin.
Oct 10 09:44:11 compute-2 systemd[1]: Started Session 25 of User ceph-admin.
Oct 10 09:44:11 compute-2 sshd-session[71801]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:12 compute-2 sudo[71805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Oct 10 09:44:12 compute-2 sudo[71805]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:12 compute-2 sudo[71805]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:12 compute-2 sshd-session[71830]: Accepted publickey for ceph-admin from 192.168.122.100 port 41936 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:12 compute-2 systemd-logind[796]: New session 26 of user ceph-admin.
Oct 10 09:44:12 compute-2 systemd[1]: Started Session 26 of User ceph-admin.
Oct 10 09:44:12 compute-2 sshd-session[71830]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:12 compute-2 sudo[71834]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:44:12 compute-2 sudo[71834]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:12 compute-2 sudo[71834]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:12 compute-2 sshd-session[71859]: Accepted publickey for ceph-admin from 192.168.122.100 port 41940 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:12 compute-2 systemd-logind[796]: New session 27 of user ceph-admin.
Oct 10 09:44:12 compute-2 systemd[1]: Started Session 27 of User ceph-admin.
Oct 10 09:44:12 compute-2 sshd-session[71859]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:12 compute-2 sudo[71863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:44:12 compute-2 sudo[71863]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:12 compute-2 sudo[71863]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:13 compute-2 sshd-session[71888]: Accepted publickey for ceph-admin from 192.168.122.100 port 41956 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:13 compute-2 systemd-logind[796]: New session 28 of user ceph-admin.
Oct 10 09:44:13 compute-2 systemd[1]: Started Session 28 of User ceph-admin.
Oct 10 09:44:13 compute-2 sshd-session[71888]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:13 compute-2 sudo[71892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Oct 10 09:44:13 compute-2 sudo[71892]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:13 compute-2 sudo[71892]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:13 compute-2 sshd-session[71917]: Accepted publickey for ceph-admin from 192.168.122.100 port 41962 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:13 compute-2 systemd-logind[796]: New session 29 of user ceph-admin.
Oct 10 09:44:13 compute-2 systemd[1]: Started Session 29 of User ceph-admin.
Oct 10 09:44:13 compute-2 sshd-session[71917]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:13 compute-2 sudo[71921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:44:13 compute-2 sudo[71921]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:13 compute-2 sudo[71921]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:13 compute-2 sshd-session[71946]: Accepted publickey for ceph-admin from 192.168.122.100 port 41964 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:13 compute-2 systemd-logind[796]: New session 30 of user ceph-admin.
Oct 10 09:44:13 compute-2 systemd[1]: Started Session 30 of User ceph-admin.
Oct 10 09:44:13 compute-2 sshd-session[71946]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:13 compute-2 sudo[71950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Oct 10 09:44:13 compute-2 sudo[71950]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:13 compute-2 sudo[71950]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:14 compute-2 sshd-session[71975]: Accepted publickey for ceph-admin from 192.168.122.100 port 41972 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:14 compute-2 systemd-logind[796]: New session 31 of user ceph-admin.
Oct 10 09:44:14 compute-2 systemd[1]: Started Session 31 of User ceph-admin.
Oct 10 09:44:14 compute-2 sshd-session[71975]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:15 compute-2 sshd-session[72002]: Accepted publickey for ceph-admin from 192.168.122.100 port 41984 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:15 compute-2 systemd-logind[796]: New session 32 of user ceph-admin.
Oct 10 09:44:15 compute-2 systemd[1]: Started Session 32 of User ceph-admin.
Oct 10 09:44:15 compute-2 sshd-session[72002]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:15 compute-2 sudo[72006]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Oct 10 09:44:15 compute-2 sudo[72006]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:15 compute-2 sudo[72006]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:15 compute-2 sshd-session[72031]: Accepted publickey for ceph-admin from 192.168.122.100 port 41986 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:44:15 compute-2 systemd-logind[796]: New session 33 of user ceph-admin.
Oct 10 09:44:15 compute-2 systemd[1]: Started Session 33 of User ceph-admin.
Oct 10 09:44:15 compute-2 sshd-session[72031]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:44:15 compute-2 sudo[72035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-2
Oct 10 09:44:15 compute-2 sudo[72035]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:16 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:44:16 compute-2 sudo[72035]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:54 compute-2 sudo[72082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:44:54 compute-2 sudo[72082]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:54 compute-2 sudo[72082]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:54 compute-2 sudo[72107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:44:54 compute-2 sudo[72107]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:54 compute-2 sudo[72107]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:54 compute-2 sudo[72132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Oct 10 09:44:54 compute-2 sudo[72132]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:54 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:44:54 compute-2 sudo[72132]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:55 compute-2 sudo[72177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:44:55 compute-2 sudo[72177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:55 compute-2 sudo[72177]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:55 compute-2 sudo[72202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Oct 10 09:44:55 compute-2 sudo[72202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:55 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:44:55 compute-2 sudo[72202]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:55 compute-2 sudo[72263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:44:55 compute-2 sudo[72263]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:55 compute-2 sudo[72263]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:55 compute-2 sudo[72288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:44:55 compute-2 sudo[72288]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:55 compute-2 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 72325 (sysctl)
Oct 10 09:44:55 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:44:55 compute-2 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct 10 09:44:55 compute-2 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct 10 09:44:56 compute-2 sudo[72288]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:56 compute-2 sudo[72347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:44:56 compute-2 sudo[72347]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:56 compute-2 sudo[72347]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:56 compute-2 sudo[72372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Oct 10 09:44:56 compute-2 sudo[72372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:56 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:44:56 compute-2 sudo[72372]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:56 compute-2 sudo[72415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:44:56 compute-2 sudo[72415]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:56 compute-2 sudo[72415]: pam_unix(sudo:session): session closed for user root
Oct 10 09:44:56 compute-2 sudo[72440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4 -- inventory --format=json-pretty --filter-for-batch
Oct 10 09:44:56 compute-2 sudo[72440]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:44:56 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:44:56 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:44:58 compute-2 systemd[1]: var-lib-containers-storage-overlay-compat2331499709-merged.mount: Deactivated successfully.
Oct 10 09:44:59 compute-2 systemd[1]: var-lib-containers-storage-overlay-compat2331499709-lower\x2dmapped.mount: Deactivated successfully.
Oct 10 09:45:13 compute-2 podman[72501]: 2025-10-10 09:45:13.458217925 +0000 UTC m=+16.482446633 container create 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True)
Oct 10 09:45:13 compute-2 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck1332118266-merged.mount: Deactivated successfully.
Oct 10 09:45:13 compute-2 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct 10 09:45:13 compute-2 systemd[1]: Started libpod-conmon-063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f.scope.
Oct 10 09:45:13 compute-2 podman[72501]: 2025-10-10 09:45:13.443390322 +0000 UTC m=+16.467619080 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:13 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:13 compute-2 podman[72501]: 2025-10-10 09:45:13.550137302 +0000 UTC m=+16.574366040 container init 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 09:45:13 compute-2 podman[72501]: 2025-10-10 09:45:13.557425935 +0000 UTC m=+16.581654653 container start 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True)
Oct 10 09:45:13 compute-2 podman[72501]: 2025-10-10 09:45:13.560501944 +0000 UTC m=+16.584730672 container attach 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:45:13 compute-2 nifty_knuth[72563]: 167 167
Oct 10 09:45:13 compute-2 systemd[1]: libpod-063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f.scope: Deactivated successfully.
Oct 10 09:45:13 compute-2 podman[72501]: 2025-10-10 09:45:13.563618823 +0000 UTC m=+16.587847541 container died 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Oct 10 09:45:13 compute-2 systemd[1]: var-lib-containers-storage-overlay-b938549fcb43b9793c46950d25b4c88bafab5f35807fe92eca9a141e5222d6cf-merged.mount: Deactivated successfully.
Oct 10 09:45:13 compute-2 podman[72501]: 2025-10-10 09:45:13.595309725 +0000 UTC m=+16.619538453 container remove 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Oct 10 09:45:13 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:45:13 compute-2 systemd[1]: libpod-conmon-063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f.scope: Deactivated successfully.
Oct 10 09:45:13 compute-2 podman[72588]: 2025-10-10 09:45:13.76729893 +0000 UTC m=+0.037989025 container create 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Oct 10 09:45:13 compute-2 systemd[1]: Started libpod-conmon-155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04.scope.
Oct 10 09:45:13 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:13 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2209aabeb53b0d845c8439475b6048d147881ac5db3d429554d3f96c785643c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:13 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2209aabeb53b0d845c8439475b6048d147881ac5db3d429554d3f96c785643c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:13 compute-2 podman[72588]: 2025-10-10 09:45:13.844902959 +0000 UTC m=+0.115593084 container init 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Oct 10 09:45:13 compute-2 podman[72588]: 2025-10-10 09:45:13.750489773 +0000 UTC m=+0.021179878 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:13 compute-2 podman[72588]: 2025-10-10 09:45:13.856148118 +0000 UTC m=+0.126838203 container start 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Oct 10 09:45:13 compute-2 podman[72588]: 2025-10-10 09:45:13.860994703 +0000 UTC m=+0.131684838 container attach 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:45:14 compute-2 loving_carson[72605]: [
Oct 10 09:45:14 compute-2 loving_carson[72605]:     {
Oct 10 09:45:14 compute-2 loving_carson[72605]:         "available": false,
Oct 10 09:45:14 compute-2 loving_carson[72605]:         "being_replaced": false,
Oct 10 09:45:14 compute-2 loving_carson[72605]:         "ceph_device_lvm": false,
Oct 10 09:45:14 compute-2 loving_carson[72605]:         "device_id": "QEMU_DVD-ROM_QM00001",
Oct 10 09:45:14 compute-2 loving_carson[72605]:         "lsm_data": {},
Oct 10 09:45:14 compute-2 loving_carson[72605]:         "lvs": [],
Oct 10 09:45:14 compute-2 loving_carson[72605]:         "path": "/dev/sr0",
Oct 10 09:45:14 compute-2 loving_carson[72605]:         "rejected_reasons": [
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "Insufficient space (<5GB)",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "Has a FileSystem"
Oct 10 09:45:14 compute-2 loving_carson[72605]:         ],
Oct 10 09:45:14 compute-2 loving_carson[72605]:         "sys_api": {
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "actuators": null,
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "device_nodes": [
Oct 10 09:45:14 compute-2 loving_carson[72605]:                 "sr0"
Oct 10 09:45:14 compute-2 loving_carson[72605]:             ],
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "devname": "sr0",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "human_readable_size": "482.00 KB",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "id_bus": "ata",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "model": "QEMU DVD-ROM",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "nr_requests": "2",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "parent": "/dev/sr0",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "partitions": {},
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "path": "/dev/sr0",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "removable": "1",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "rev": "2.5+",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "ro": "0",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "rotational": "0",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "sas_address": "",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "sas_device_handle": "",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "scheduler_mode": "mq-deadline",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "sectors": 0,
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "sectorsize": "2048",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "size": 493568.0,
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "support_discard": "2048",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "type": "disk",
Oct 10 09:45:14 compute-2 loving_carson[72605]:             "vendor": "QEMU"
Oct 10 09:45:14 compute-2 loving_carson[72605]:         }
Oct 10 09:45:14 compute-2 loving_carson[72605]:     }
Oct 10 09:45:14 compute-2 loving_carson[72605]: ]
Oct 10 09:45:14 compute-2 systemd[1]: libpod-155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04.scope: Deactivated successfully.
Oct 10 09:45:14 compute-2 podman[73561]: 2025-10-10 09:45:14.608746322 +0000 UTC m=+0.022106257 container died 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct 10 09:45:14 compute-2 systemd[1]: var-lib-containers-storage-overlay-a2209aabeb53b0d845c8439475b6048d147881ac5db3d429554d3f96c785643c-merged.mount: Deactivated successfully.
Oct 10 09:45:14 compute-2 podman[73561]: 2025-10-10 09:45:14.640974532 +0000 UTC m=+0.054334467 container remove 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:45:14 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:45:14 compute-2 systemd[1]: libpod-conmon-155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04.scope: Deactivated successfully.
Oct 10 09:45:14 compute-2 sudo[72440]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:14 compute-2 sudo[73576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Oct 10 09:45:14 compute-2 sudo[73576]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:14 compute-2 sudo[73576]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:14 compute-2 sudo[73601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph
Oct 10 09:45:14 compute-2 sudo[73601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:14 compute-2 sudo[73601]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:14 compute-2 sudo[73626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:45:14 compute-2 sudo[73626]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:14 compute-2 sudo[73626]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:14 compute-2 sudo[73651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:14 compute-2 sudo[73651]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:14 compute-2 sudo[73651]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:45:15 compute-2 sudo[73676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73676]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:45:15 compute-2 sudo[73724]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73724]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:45:15 compute-2 sudo[73749]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73749]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Oct 10 09:45:15 compute-2 sudo[73774]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73774]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:45:15 compute-2 sudo[73799]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73799]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:45:15 compute-2 sudo[73824]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73824]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:45:15 compute-2 sudo[73849]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73849]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:15 compute-2 sudo[73874]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73874]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:45:15 compute-2 sudo[73899]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73899]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:45:15 compute-2 sudo[73947]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73947]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:45:15 compute-2 sudo[73972]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73972]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[73997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:45:15 compute-2 sudo[73997]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[73997]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[74022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Oct 10 09:45:15 compute-2 sudo[74022]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[74022]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[74047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph
Oct 10 09:45:15 compute-2 sudo[74047]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:15 compute-2 sudo[74047]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:15 compute-2 sudo[74072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:45:16 compute-2 sudo[74072]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74072]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74097]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:16 compute-2 sudo[74097]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74097]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:45:16 compute-2 sudo[74122]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74122]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:45:16 compute-2 sudo[74170]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74170]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:45:16 compute-2 sudo[74195]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74195]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Oct 10 09:45:16 compute-2 sudo[74220]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74220]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:45:16 compute-2 sudo[74245]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74245]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:45:16 compute-2 sudo[74270]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74270]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:45:16 compute-2 sudo[74295]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74295]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:16 compute-2 sudo[74320]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74320]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74345]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:45:16 compute-2 sudo[74345]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74345]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:45:16 compute-2 sudo[74393]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74393]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:45:16 compute-2 sudo[74418]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74418]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:16 compute-2 sudo[74443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:45:16 compute-2 sudo[74443]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:16 compute-2 sudo[74443]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:17 compute-2 sudo[74468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:45:17 compute-2 sudo[74468]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:17 compute-2 sudo[74468]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:17 compute-2 sudo[74497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:17 compute-2 sudo[74497]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:17 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:45:17 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:45:17 compute-2 podman[74564]: 2025-10-10 09:45:17.498274215 +0000 UTC m=+0.021855909 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:17 compute-2 podman[74564]: 2025-10-10 09:45:17.630903493 +0000 UTC m=+0.154485187 container create df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:45:17 compute-2 systemd[1]: Started libpod-conmon-df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7.scope.
Oct 10 09:45:17 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:17 compute-2 podman[74564]: 2025-10-10 09:45:17.889085181 +0000 UTC m=+0.412666885 container init df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:45:17 compute-2 podman[74564]: 2025-10-10 09:45:17.895624279 +0000 UTC m=+0.419205973 container start df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Oct 10 09:45:17 compute-2 podman[74564]: 2025-10-10 09:45:17.898992357 +0000 UTC m=+0.422574051 container attach df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True)
Oct 10 09:45:17 compute-2 zen_pare[74580]: 167 167
Oct 10 09:45:17 compute-2 podman[74564]: 2025-10-10 09:45:17.901736775 +0000 UTC m=+0.425318459 container died df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:45:17 compute-2 systemd[1]: libpod-df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7.scope: Deactivated successfully.
Oct 10 09:45:17 compute-2 podman[74564]: 2025-10-10 09:45:17.934405468 +0000 UTC m=+0.457987162 container remove df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Oct 10 09:45:17 compute-2 systemd[1]: libpod-conmon-df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7.scope: Deactivated successfully.
Oct 10 09:45:17 compute-2 podman[74598]: 2025-10-10 09:45:17.995947394 +0000 UTC m=+0.039071669 container create 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Oct 10 09:45:18 compute-2 systemd[1]: Started libpod-conmon-05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481.scope.
Oct 10 09:45:18 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:18 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efbe33c476a5d83bf99a66d28abe443a4190b0a08613813527c84caf2097cfba/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:18 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efbe33c476a5d83bf99a66d28abe443a4190b0a08613813527c84caf2097cfba/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:18 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efbe33c476a5d83bf99a66d28abe443a4190b0a08613813527c84caf2097cfba/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:18 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efbe33c476a5d83bf99a66d28abe443a4190b0a08613813527c84caf2097cfba/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:18 compute-2 podman[74598]: 2025-10-10 09:45:18.055782756 +0000 UTC m=+0.098907061 container init 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Oct 10 09:45:18 compute-2 podman[74598]: 2025-10-10 09:45:18.061636493 +0000 UTC m=+0.104760768 container start 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Oct 10 09:45:18 compute-2 podman[74598]: 2025-10-10 09:45:18.06496161 +0000 UTC m=+0.108085905 container attach 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 10 09:45:18 compute-2 podman[74598]: 2025-10-10 09:45:17.978703134 +0000 UTC m=+0.021827439 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:18 compute-2 systemd[1]: libpod-05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481.scope: Deactivated successfully.
Oct 10 09:45:18 compute-2 podman[74598]: 2025-10-10 09:45:18.120413171 +0000 UTC m=+0.163537456 container died 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1)
Oct 10 09:45:18 compute-2 podman[74598]: 2025-10-10 09:45:18.15294748 +0000 UTC m=+0.196071755 container remove 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 10 09:45:18 compute-2 systemd[1]: libpod-conmon-05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481.scope: Deactivated successfully.
Oct 10 09:45:18 compute-2 systemd[1]: Reloading.
Oct 10 09:45:18 compute-2 systemd-rc-local-generator[74682]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:45:18 compute-2 systemd-sysv-generator[74685]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:45:18 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:45:18 compute-2 systemd[1]: Reloading.
Oct 10 09:45:18 compute-2 systemd-sysv-generator[74722]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:45:18 compute-2 systemd-rc-local-generator[74717]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:45:18 compute-2 systemd[1]: Reached target All Ceph clusters and services.
Oct 10 09:45:18 compute-2 systemd[1]: Reloading.
Oct 10 09:45:18 compute-2 systemd-sysv-generator[74761]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:45:18 compute-2 systemd-rc-local-generator[74758]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:45:18 compute-2 systemd[1]: Reached target Ceph cluster 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:45:18 compute-2 systemd[1]: Reloading.
Oct 10 09:45:19 compute-2 systemd-rc-local-generator[74791]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:45:19 compute-2 systemd-sysv-generator[74796]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:45:19 compute-2 systemd[1]: Reloading.
Oct 10 09:45:19 compute-2 systemd-rc-local-generator[74834]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:45:19 compute-2 systemd-sysv-generator[74839]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:45:19 compute-2 systemd[1]: Created slice Slice /system/ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:45:19 compute-2 systemd[1]: Reached target System Time Set.
Oct 10 09:45:19 compute-2 systemd[1]: Reached target System Time Synchronized.
Oct 10 09:45:19 compute-2 systemd[1]: Starting Ceph mon.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:45:19 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:45:19 compute-2 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 09:45:19 compute-2 podman[74893]: 2025-10-10 09:45:19.704777167 +0000 UTC m=+0.040439083 container create bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 09:45:19 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c31ac3b2f06a3f3ca385e8b0f02e9fc3131446e2d14a4a8898eb05888cb131d2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:19 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c31ac3b2f06a3f3ca385e8b0f02e9fc3131446e2d14a4a8898eb05888cb131d2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:19 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c31ac3b2f06a3f3ca385e8b0f02e9fc3131446e2d14a4a8898eb05888cb131d2/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:19 compute-2 podman[74893]: 2025-10-10 09:45:19.773468472 +0000 UTC m=+0.109130388 container init bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:45:19 compute-2 podman[74893]: 2025-10-10 09:45:19.77871997 +0000 UTC m=+0.114381856 container start bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Oct 10 09:45:19 compute-2 bash[74893]: bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd
Oct 10 09:45:19 compute-2 podman[74893]: 2025-10-10 09:45:19.686395971 +0000 UTC m=+0.022057877 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:19 compute-2 systemd[1]: Started Ceph mon.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:45:19 compute-2 ceph-mon[74913]: set uid:gid to 167:167 (ceph:ceph)
Oct 10 09:45:19 compute-2 ceph-mon[74913]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pidfile_write: ignore empty --pid-file
Oct 10 09:45:19 compute-2 ceph-mon[74913]: load: jerasure load: lrc 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: RocksDB version: 7.9.2
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Git sha 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Compile date 2025-07-17 03:12:14
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: DB SUMMARY
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: DB Session ID:  2V808MJHDIXUCLJZ1TSV
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: CURRENT file:  CURRENT
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: IDENTITY file:  IDENTITY
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                         Options.error_if_exists: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                       Options.create_if_missing: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                         Options.paranoid_checks: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.flush_verify_memtable_count: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                                     Options.env: 0x561619484c20
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                                      Options.fs: PosixFileSystem
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                                Options.info_log: 0x56161a93fa20
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                Options.max_file_opening_threads: 16
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                              Options.statistics: (nil)
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                               Options.use_fsync: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                       Options.max_log_file_size: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                   Options.log_file_time_to_roll: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                       Options.keep_log_file_num: 1000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                    Options.recycle_log_file_num: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                         Options.allow_fallocate: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                        Options.allow_mmap_reads: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                       Options.allow_mmap_writes: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                        Options.use_direct_reads: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:          Options.create_missing_column_families: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                              Options.db_log_dir: 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                                 Options.wal_dir: 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                Options.table_cache_numshardbits: 6
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                         Options.WAL_ttl_seconds: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                       Options.WAL_size_limit_MB: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.manifest_preallocation_size: 4194304
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                     Options.is_fd_close_on_exec: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                   Options.advise_random_on_open: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                    Options.db_write_buffer_size: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                    Options.write_buffer_manager: 0x56161a943900
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.access_hint_on_compaction_start: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                      Options.use_adaptive_mutex: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                            Options.rate_limiter: (nil)
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                       Options.wal_recovery_mode: 2
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                  Options.enable_thread_tracking: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                  Options.enable_pipelined_write: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                  Options.unordered_write: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.write_thread_max_yield_usec: 100
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                               Options.row_cache: None
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                              Options.wal_filter: None
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.avoid_flush_during_recovery: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.allow_ingest_behind: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.two_write_queues: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.manual_wal_flush: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.wal_compression: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.atomic_flush: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                 Options.persist_stats_to_disk: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                 Options.write_dbid_to_manifest: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                 Options.log_readahead_size: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                 Options.best_efforts_recovery: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.allow_data_in_errors: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.db_host_id: __hostname__
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.enforce_single_del_contracts: true
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.max_background_jobs: 2
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.max_background_compactions: -1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.max_subcompactions: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.delayed_write_rate : 16777216
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.max_total_wal_size: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                   Options.stats_dump_period_sec: 600
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                 Options.stats_persist_period_sec: 600
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                          Options.max_open_files: -1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                          Options.bytes_per_sync: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                      Options.wal_bytes_per_sync: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                   Options.strict_bytes_per_sync: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:       Options.compaction_readahead_size: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                  Options.max_background_flushes: -1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Compression algorithms supported:
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         kZSTD supported: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         kXpressCompression supported: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         kBZip2Compression supported: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         kZSTDNotFinalCompression supported: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         kLZ4Compression supported: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         kZlibCompression supported: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         kLZ4HCCompression supported: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         kSnappyCompression supported: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Fast CRC32 supported: Supported on x86
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: DMutex implementation: pthread_mutex_t
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:           Options.merge_operator: 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56161a93e5c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x56161a963350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:        Options.write_buffer_size: 33554432
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:  Options.max_write_buffer_number: 2
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:          Options.compression: NoCompression
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: c3989026-94dc-41dd-a555-ef3b3fd6f1b8
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089519820949, "job": 1, "event": "recovery_started", "wal_files": [4]}
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089519822758, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089519822895, "job": 1, "event": "recovery_finished"}
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x56161a964e00
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: DB pointer 0x56161aa6e000
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 09:45:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x56161a963350#2 capacity: 512.00 MB usage: 0.86 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.64 KB,0.00012219%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Oct 10 09:45:19 compute-2 ceph-mon[74913]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Oct 10 09:45:19 compute-2 ceph-mon[74913]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:19 compute-2 sudo[74497]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(???) e0 preinit fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).mds e1 new map
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).mds e1 print_map
                                           e1
                                           btime 2025-10-10T09:43:15:731413+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 1 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 1 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e15 crush map has features 3314933000852226048, adjusting msgr requires
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e15 crush map has features 288514051259236352, adjusting msgr requires
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e15 crush map has features 288514051259236352, adjusting msgr requires
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e15 crush map has features 288514051259236352, adjusting msgr requires
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Failed to apply mon spec MONSpec.from_json(yaml.safe_load('''service_type: mon
                                           service_name: mon
                                           placement:
                                             hosts:
                                             - compute-0
                                             - compute-1
                                             - compute-2
                                           ''')): Cannot place <MONSpec for service_name=mon> on compute-2: Unknown hosts
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v23: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Failed to apply mgr spec ServiceSpec.from_json(yaml.safe_load('''service_type: mgr
                                           service_name: mgr
                                           placement:
                                             hosts:
                                             - compute-0
                                             - compute-1
                                             - compute-2
                                           ''')): Cannot place <ServiceSpec for service_name=mgr> on compute-2: Unknown hosts
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v24: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Deploying daemon crash.compute-1 on compute-1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Health check failed: Failed to apply 2 service(s): mon,mgr (CEPHADM_APPLY_SPEC_FAIL)
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v25: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v26: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4172963951' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c307f4a4-39e7-4a9c-9d19-a2b8712089ab"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4172963951' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c307f4a4-39e7-4a9c-9d19-a2b8712089ab"}]': finished
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e4: 1 total, 0 up, 1 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/234960172' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "aea3dcf0-efc7-4ff7-81f8-9509a806fb04"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/234960172' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "aea3dcf0-efc7-4ff7-81f8-9509a806fb04"}]': finished
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e5: 2 total, 0 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2176337060' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1441666751' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v29: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v30: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v31: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Deploying daemon osd.0 on compute-0
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Deploying daemon osd.1 on compute-1
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v32: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/192005781' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v33: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v34: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='osd.0 [v2:192.168.122.100:6802/2298200206,v1:192.168.122.100:6803/2298200206]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v35: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='osd.0 [v2:192.168.122.100:6802/2298200206,v1:192.168.122.100:6803/2298200206]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e6: 2 total, 0 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='osd.0 [v2:192.168.122.100:6802/2298200206,v1:192.168.122.100:6803/2298200206]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='osd.0 [v2:192.168.122.100:6802/2298200206,v1:192.168.122.100:6803/2298200206]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e7: 2 total, 0 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='osd.1 [v2:192.168.122.101:6800/2840395396,v1:192.168.122.101:6801/2840395396]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: purged_snaps scrub starts
Oct 10 09:45:19 compute-2 ceph-mon[74913]: purged_snaps scrub ok
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v38: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='osd.1 [v2:192.168.122.101:6800/2840395396,v1:192.168.122.101:6801/2840395396]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e8: 2 total, 0 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='osd.1 [v2:192.168.122.101:6800/2840395396,v1:192.168.122.101:6801/2840395396]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: purged_snaps scrub starts
Oct 10 09:45:19 compute-2 ceph-mon[74913]: purged_snaps scrub ok
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='osd.1 [v2:192.168.122.101:6800/2840395396,v1:192.168.122.101:6801/2840395396]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]': finished
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e9: 2 total, 0 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v41: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Adjusting osd_memory_target on compute-1 to  5248M
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: OSD bench result of 8693.274022 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osd.0 [v2:192.168.122.100:6802/2298200206,v1:192.168.122.100:6803/2298200206] boot
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e10: 2 total, 1 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Adjusting osd_memory_target on compute-0 to 128.0M
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Unable to set osd_memory_target on compute-0 to 134240665: error parsing value: Value '134240665' is below minimum 939524096
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e11: 2 total, 1 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v44: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e12: 2 total, 1 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: OSD bench result of 2508.856277 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osd.1 [v2:192.168.122.101:6800/2840395396,v1:192.168.122.101:6801/2840395396] boot
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e13: 2 total, 2 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v47: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e14: 2 total, 2 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: osdmap e15: 2 total, 2 up, 2 in
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v50: 1 pgs: 1 creating+peering; 0 B data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mgrmap e9: compute-0.xkdepb(active, since 87s)
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v51: 1 pgs: 1 creating+peering; 0 B data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v53: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.conf
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:19 compute-2 ceph-mon[74913]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Deploying daemon mon.compute-2 on compute-2
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Oct 10 09:45:19 compute-2 ceph-mon[74913]: Cluster is now healthy
Oct 10 09:45:19 compute-2 ceph-mon[74913]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Oct 10 09:45:21 compute-2 ceph-mon[74913]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Oct 10 09:45:21 compute-2 ceph-mon[74913]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Oct 10 09:45:21 compute-2 ceph-mon[74913]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Oct 10 09:45:21 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 09:45:22 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct 10 09:45:22 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct 10 09:45:24 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct 10 09:45:24 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 09:45:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Oct 10 09:45:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Oct 10 09:45:24 compute-2 ceph-mon[74913]: pgmap v58: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:24 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1167870161' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Oct 10 09:45:24 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:24 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:24 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:24 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct 10 09:45:24 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Oct 10 09:45:24 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 09:45:24 compute-2 ceph-mon[74913]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Tue Sep 30 07:37:35 UTC 2025,kernel_version=5.14.0-621.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864356,os=Linux}
Oct 10 09:45:25 compute-2 ceph-mon[74913]: Deploying daemon mon.compute-1 on compute-1
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Oct 10 09:45:25 compute-2 ceph-mon[74913]: mon.compute-0 calling monitor election
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Oct 10 09:45:25 compute-2 ceph-mon[74913]: pgmap v59: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Oct 10 09:45:25 compute-2 ceph-mon[74913]: mon.compute-2 calling monitor election
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Oct 10 09:45:25 compute-2 ceph-mon[74913]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Oct 10 09:45:25 compute-2 ceph-mon[74913]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Oct 10 09:45:25 compute-2 ceph-mon[74913]: monmap epoch 2
Oct 10 09:45:25 compute-2 ceph-mon[74913]: fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:25 compute-2 ceph-mon[74913]: last_changed 2025-10-10T09:45:19.903599+0000
Oct 10 09:45:25 compute-2 ceph-mon[74913]: created 2025-10-10T09:43:13.233588+0000
Oct 10 09:45:25 compute-2 ceph-mon[74913]: min_mon_release 19 (squid)
Oct 10 09:45:25 compute-2 ceph-mon[74913]: election_strategy: 1
Oct 10 09:45:25 compute-2 ceph-mon[74913]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Oct 10 09:45:25 compute-2 ceph-mon[74913]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Oct 10 09:45:25 compute-2 ceph-mon[74913]: fsmap 
Oct 10 09:45:25 compute-2 ceph-mon[74913]: osdmap e15: 2 total, 2 up, 2 in
Oct 10 09:45:25 compute-2 ceph-mon[74913]: mgrmap e9: compute-0.xkdepb(active, since 106s)
Oct 10 09:45:25 compute-2 ceph-mon[74913]: overall HEALTH_OK
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:25 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:25 compute-2 sudo[74952]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:45:25 compute-2 sudo[74952]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:25 compute-2 sudo[74952]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:25 compute-2 sudo[74977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:25 compute-2 sudo[74977]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:25 compute-2 podman[75042]: 2025-10-10 09:45:25.549977327 +0000 UTC m=+0.037306033 container create ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:45:25 compute-2 systemd[1]: Started libpod-conmon-ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f.scope.
Oct 10 09:45:25 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:25 compute-2 podman[75042]: 2025-10-10 09:45:25.532706575 +0000 UTC m=+0.020035281 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:25 compute-2 podman[75042]: 2025-10-10 09:45:25.633095332 +0000 UTC m=+0.120424048 container init ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:45:25 compute-2 podman[75042]: 2025-10-10 09:45:25.639993603 +0000 UTC m=+0.127322299 container start ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:45:25 compute-2 podman[75042]: 2025-10-10 09:45:25.644195066 +0000 UTC m=+0.131523762 container attach ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:45:25 compute-2 compassionate_galois[75058]: 167 167
Oct 10 09:45:25 compute-2 systemd[1]: libpod-ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f.scope: Deactivated successfully.
Oct 10 09:45:25 compute-2 podman[75042]: 2025-10-10 09:45:25.647301886 +0000 UTC m=+0.134630582 container died ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid)
Oct 10 09:45:25 compute-2 systemd[1]: var-lib-containers-storage-overlay-b1ce5ac7e351281de10c42fcb7fdce4ab56a5fdfe2beeb7c1660adf9d0967d1b-merged.mount: Deactivated successfully.
Oct 10 09:45:25 compute-2 podman[75042]: 2025-10-10 09:45:25.690953731 +0000 UTC m=+0.178282427 container remove ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1)
Oct 10 09:45:25 compute-2 systemd[1]: libpod-conmon-ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f.scope: Deactivated successfully.
Oct 10 09:45:25 compute-2 systemd[1]: Reloading.
Oct 10 09:45:25 compute-2 systemd-rc-local-generator[75099]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:45:25 compute-2 systemd-sysv-generator[75103]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:45:25 compute-2 systemd[1]: Reloading.
Oct 10 09:45:26 compute-2 ceph-mon[74913]: pgmap v61: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:26 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:26 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gkrssp", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Oct 10 09:45:26 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gkrssp", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Oct 10 09:45:26 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 09:45:26 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:26 compute-2 ceph-mon[74913]: Deploying daemon mgr.compute-2.gkrssp on compute-2
Oct 10 09:45:26 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:45:26 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Oct 10 09:45:26 compute-2 systemd-rc-local-generator[75140]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:45:26 compute-2 systemd-sysv-generator[75145]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:45:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct 10 09:45:26 compute-2 ceph-mon[74913]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Oct 10 09:45:26 compute-2 ceph-mon[74913]: paxos.1).electionLogic(10) init, last seen epoch 10
Oct 10 09:45:26 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 09:45:26 compute-2 systemd[1]: Starting Ceph mgr.compute-2.gkrssp for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:45:26 compute-2 podman[75199]: 2025-10-10 09:45:26.462093766 +0000 UTC m=+0.045947218 container create 04def5c470185e333ff2788fce44cd382250a90c2fb8289f5f3139b45bba29d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Oct 10 09:45:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9c4da16a382606adfde96f4085a30764f5f7ae97af0986785fc6e0b8502eedd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9c4da16a382606adfde96f4085a30764f5f7ae97af0986785fc6e0b8502eedd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9c4da16a382606adfde96f4085a30764f5f7ae97af0986785fc6e0b8502eedd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9c4da16a382606adfde96f4085a30764f5f7ae97af0986785fc6e0b8502eedd/merged/var/lib/ceph/mgr/ceph-compute-2.gkrssp supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:26 compute-2 podman[75199]: 2025-10-10 09:45:26.5317034 +0000 UTC m=+0.115556862 container init 04def5c470185e333ff2788fce44cd382250a90c2fb8289f5f3139b45bba29d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 09:45:26 compute-2 podman[75199]: 2025-10-10 09:45:26.441567641 +0000 UTC m=+0.025421113 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:26 compute-2 podman[75199]: 2025-10-10 09:45:26.53949477 +0000 UTC m=+0.123348222 container start 04def5c470185e333ff2788fce44cd382250a90c2fb8289f5f3139b45bba29d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Oct 10 09:45:26 compute-2 bash[75199]: 04def5c470185e333ff2788fce44cd382250a90c2fb8289f5f3139b45bba29d4
Oct 10 09:45:26 compute-2 systemd[1]: Started Ceph mgr.compute-2.gkrssp for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:45:26 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 09:45:26 compute-2 sudo[74977]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:26 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 09:45:27 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 09:45:27 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 09:45:29 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 09:45:29 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 09:45:30 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 09:45:30 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 09:45:31 compute-2 ceph-mon[74913]: paxos.1).electionLogic(11) init, last seen epoch 11, mid-election, bumping
Oct 10 09:45:31 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 09:45:31 compute-2 ceph-mon[74913]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 09:45:31 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Oct 10 09:45:31 compute-2 ceph-mon[74913]: mon.compute-0 calling monitor election
Oct 10 09:45:31 compute-2 ceph-mon[74913]: mon.compute-2 calling monitor election
Oct 10 09:45:31 compute-2 ceph-mon[74913]: pgmap v62: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:45:31 compute-2 ceph-mon[74913]: mon.compute-1 calling monitor election
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:45:31 compute-2 ceph-mon[74913]: pgmap v63: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:45:31 compute-2 ceph-mon[74913]: pgmap v64: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:45:31 compute-2 ceph-mon[74913]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Oct 10 09:45:31 compute-2 ceph-mon[74913]: monmap epoch 3
Oct 10 09:45:31 compute-2 ceph-mon[74913]: fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:31 compute-2 ceph-mon[74913]: last_changed 2025-10-10T09:45:26.181993+0000
Oct 10 09:45:31 compute-2 ceph-mon[74913]: created 2025-10-10T09:43:13.233588+0000
Oct 10 09:45:31 compute-2 ceph-mon[74913]: min_mon_release 19 (squid)
Oct 10 09:45:31 compute-2 ceph-mon[74913]: election_strategy: 1
Oct 10 09:45:31 compute-2 ceph-mon[74913]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Oct 10 09:45:31 compute-2 ceph-mon[74913]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Oct 10 09:45:31 compute-2 ceph-mon[74913]: 2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Oct 10 09:45:31 compute-2 ceph-mon[74913]: fsmap 
Oct 10 09:45:31 compute-2 ceph-mon[74913]: osdmap e15: 2 total, 2 up, 2 in
Oct 10 09:45:31 compute-2 ceph-mon[74913]: mgrmap e9: compute-0.xkdepb(active, since 112s)
Oct 10 09:45:31 compute-2 ceph-mon[74913]: overall HEALTH_OK
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:31 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.rfugxc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Oct 10 09:45:32 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.rfugxc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Oct 10 09:45:32 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 09:45:32 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:32 compute-2 ceph-mon[74913]: Deploying daemon mgr.compute-1.rfugxc on compute-1
Oct 10 09:45:32 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:45:32 compute-2 ceph-mgr[75218]: set uid:gid to 167:167 (ceph:ceph)
Oct 10 09:45:32 compute-2 ceph-mgr[75218]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Oct 10 09:45:32 compute-2 ceph-mgr[75218]: pidfile_write: ignore empty --pid-file
Oct 10 09:45:32 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'alerts'
Oct 10 09:45:32 compute-2 ceph-mgr[75218]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 09:45:32 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'balancer'
Oct 10 09:45:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:32.755+0000 7f4b1c081140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 09:45:32 compute-2 ceph-mgr[75218]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 09:45:32 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'cephadm'
Oct 10 09:45:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:32.845+0000 7f4b1c081140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 09:45:33 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3667835426' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 09:45:33 compute-2 ceph-mon[74913]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 853 MiB used, 39 GiB / 40 GiB avail
Oct 10 09:45:33 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:33 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:33 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:33 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:33 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Oct 10 09:45:33 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Oct 10 09:45:33 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:33 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e16 e16: 2 total, 2 up, 2 in
Oct 10 09:45:33 compute-2 sudo[75250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:45:33 compute-2 sudo[75250]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:33 compute-2 sudo[75250]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:33 compute-2 sudo[75275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:33 compute-2 sudo[75275]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:33 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'crash'
Oct 10 09:45:33 compute-2 ceph-mgr[75218]: mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 09:45:33 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'dashboard'
Oct 10 09:45:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:33.671+0000 7f4b1c081140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 09:45:33 compute-2 podman[75342]: 2025-10-10 09:45:33.835255371 +0000 UTC m=+0.045040780 container create b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Oct 10 09:45:33 compute-2 systemd[1]: Started libpod-conmon-b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b.scope.
Oct 10 09:45:33 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:33 compute-2 podman[75342]: 2025-10-10 09:45:33.815626434 +0000 UTC m=+0.025411863 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:33 compute-2 podman[75342]: 2025-10-10 09:45:33.92071138 +0000 UTC m=+0.130496779 container init b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:45:33 compute-2 podman[75342]: 2025-10-10 09:45:33.92916201 +0000 UTC m=+0.138947389 container start b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Oct 10 09:45:33 compute-2 podman[75342]: 2025-10-10 09:45:33.932955752 +0000 UTC m=+0.142741141 container attach b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Oct 10 09:45:33 compute-2 elated_dewdney[75360]: 167 167
Oct 10 09:45:33 compute-2 systemd[1]: libpod-b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b.scope: Deactivated successfully.
Oct 10 09:45:33 compute-2 podman[75342]: 2025-10-10 09:45:33.938046474 +0000 UTC m=+0.147831853 container died b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:45:33 compute-2 systemd[1]: var-lib-containers-storage-overlay-f5d969ae85a423e47d1305a6d8579b77154899f5c8c0e93b7c5fc1ca28b824b1-merged.mount: Deactivated successfully.
Oct 10 09:45:33 compute-2 podman[75342]: 2025-10-10 09:45:33.988904889 +0000 UTC m=+0.198690268 container remove b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Oct 10 09:45:33 compute-2 systemd[1]: libpod-conmon-b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b.scope: Deactivated successfully.
Oct 10 09:45:34 compute-2 systemd[1]: Reloading.
Oct 10 09:45:34 compute-2 systemd-rc-local-generator[75404]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:45:34 compute-2 systemd-sysv-generator[75409]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:45:34 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'devicehealth'
Oct 10 09:45:34 compute-2 systemd[1]: Reloading.
Oct 10 09:45:34 compute-2 ceph-mon[74913]: Deploying daemon crash.compute-2 on compute-2
Oct 10 09:45:34 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3667835426' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 09:45:34 compute-2 ceph-mon[74913]: osdmap e16: 2 total, 2 up, 2 in
Oct 10 09:45:34 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3269086226' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 09:45:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e17 e17: 2 total, 2 up, 2 in
Oct 10 09:45:34 compute-2 ceph-mgr[75218]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 09:45:34 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'diskprediction_local'
Oct 10 09:45:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:34.375+0000 7f4b1c081140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 09:45:34 compute-2 systemd-rc-local-generator[75445]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:45:34 compute-2 systemd-sysv-generator[75448]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:45:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Oct 10 09:45:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Oct 10 09:45:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]:   from numpy import show_config as show_numpy_config
Oct 10 09:45:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:34.550+0000 7f4b1c081140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 09:45:34 compute-2 ceph-mgr[75218]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 09:45:34 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'influx'
Oct 10 09:45:34 compute-2 systemd[1]: Starting Ceph crash.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:45:34 compute-2 ceph-mgr[75218]: mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 09:45:34 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'insights'
Oct 10 09:45:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:34.623+0000 7f4b1c081140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 09:45:34 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'iostat'
Oct 10 09:45:34 compute-2 ceph-mgr[75218]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 09:45:34 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'k8sevents'
Oct 10 09:45:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:34.760+0000 7f4b1c081140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 09:45:34 compute-2 podman[75504]: 2025-10-10 09:45:34.836410525 +0000 UTC m=+0.047845000 container create e6626ca9d8bcc16a7f77c3eb4e12186e85303a5987606787a8b2590756016ba3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True)
Oct 10 09:45:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e17 _set_new_cache_sizes cache_size:1019932584 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:45:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7303a7ba1602d1ff1915d9a031f0e5c69836bebaa5a6751cedc86b4b83fa65/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7303a7ba1602d1ff1915d9a031f0e5c69836bebaa5a6751cedc86b4b83fa65/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7303a7ba1602d1ff1915d9a031f0e5c69836bebaa5a6751cedc86b4b83fa65/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:34 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7303a7ba1602d1ff1915d9a031f0e5c69836bebaa5a6751cedc86b4b83fa65/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:34 compute-2 podman[75504]: 2025-10-10 09:45:34.815292411 +0000 UTC m=+0.026726886 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:34 compute-2 podman[75504]: 2025-10-10 09:45:34.924728057 +0000 UTC m=+0.136162592 container init e6626ca9d8bcc16a7f77c3eb4e12186e85303a5987606787a8b2590756016ba3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Oct 10 09:45:34 compute-2 podman[75504]: 2025-10-10 09:45:34.931346728 +0000 UTC m=+0.142781203 container start e6626ca9d8bcc16a7f77c3eb4e12186e85303a5987606787a8b2590756016ba3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 09:45:34 compute-2 bash[75504]: e6626ca9d8bcc16a7f77c3eb4e12186e85303a5987606787a8b2590756016ba3
Oct 10 09:45:34 compute-2 systemd[1]: Started Ceph crash.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: INFO:ceph-crash:pinging cluster to exercise our key
Oct 10 09:45:35 compute-2 sudo[75275]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.108+0000 7fbc726b8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.108+0000 7fbc726b8640 -1 AuthRegistry(0x7fbc6c0696b0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.109+0000 7fbc726b8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.109+0000 7fbc726b8640 -1 AuthRegistry(0x7fbc726b6ff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.110+0000 7fbc70c2e640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.111+0000 7fbc6b7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.112+0000 7fbc6bfff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.112+0000 7fbc726b8640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: [errno 13] RADOS permission denied (error connecting to the cluster)
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Oct 10 09:45:35 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'localpool'
Oct 10 09:45:35 compute-2 sudo[75526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:45:35 compute-2 sudo[75526]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:35 compute-2 sudo[75526]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:35 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'mds_autoscaler'
Oct 10 09:45:35 compute-2 sudo[75561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Oct 10 09:45:35 compute-2 sudo[75561]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:35 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3269086226' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 09:45:35 compute-2 ceph-mon[74913]: osdmap e17: 2 total, 2 up, 2 in
Oct 10 09:45:35 compute-2 ceph-mon[74913]: pgmap v68: 3 pgs: 2 unknown, 1 active+clean; 449 KiB data, 453 MiB used, 40 GiB / 40 GiB avail
Oct 10 09:45:35 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:35 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:35 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:35 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:35 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:45:35 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:45:35 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:35 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:45:35 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:35 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1727378227' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 09:45:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e18 e18: 2 total, 2 up, 2 in
Oct 10 09:45:35 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'mirroring'
Oct 10 09:45:35 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'nfs'
Oct 10 09:45:35 compute-2 podman[75628]: 2025-10-10 09:45:35.650803222 +0000 UTC m=+0.043562882 container create 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default)
Oct 10 09:45:35 compute-2 systemd[1]: Started libpod-conmon-224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e.scope.
Oct 10 09:45:35 compute-2 podman[75628]: 2025-10-10 09:45:35.632188918 +0000 UTC m=+0.024948618 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:35 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:35 compute-2 podman[75628]: 2025-10-10 09:45:35.748582036 +0000 UTC m=+0.141341736 container init 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:45:35 compute-2 podman[75628]: 2025-10-10 09:45:35.756345624 +0000 UTC m=+0.149105314 container start 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Oct 10 09:45:35 compute-2 podman[75628]: 2025-10-10 09:45:35.761272312 +0000 UTC m=+0.154031982 container attach 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:45:35 compute-2 infallible_jones[75645]: 167 167
Oct 10 09:45:35 compute-2 systemd[1]: libpod-224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e.scope: Deactivated successfully.
Oct 10 09:45:35 compute-2 podman[75628]: 2025-10-10 09:45:35.763174253 +0000 UTC m=+0.155933913 container died 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:45:35 compute-2 systemd[1]: var-lib-containers-storage-overlay-fafe8ae54286663e98e3ac7b70335eae2f8c77dd16a7f5309b2faa66a02c505a-merged.mount: Deactivated successfully.
Oct 10 09:45:35 compute-2 ceph-mgr[75218]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 09:45:35 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'orchestrator'
Oct 10 09:45:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:35.795+0000 7f4b1c081140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 09:45:35 compute-2 podman[75628]: 2025-10-10 09:45:35.798231342 +0000 UTC m=+0.190991002 container remove 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Oct 10 09:45:35 compute-2 systemd[1]: libpod-conmon-224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e.scope: Deactivated successfully.
Oct 10 09:45:35 compute-2 podman[75669]: 2025-10-10 09:45:35.969294307 +0000 UTC m=+0.045926557 container create df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:45:36 compute-2 systemd[1]: Started libpod-conmon-df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac.scope.
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'osd_perf_query'
Oct 10 09:45:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.005+0000 7f4b1c081140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:36 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:36 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:36 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:36 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:36 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:36 compute-2 podman[75669]: 2025-10-10 09:45:35.952906274 +0000 UTC m=+0.029538544 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:36 compute-2 podman[75669]: 2025-10-10 09:45:36.058202208 +0000 UTC m=+0.134834478 container init df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:45:36 compute-2 podman[75669]: 2025-10-10 09:45:36.065382417 +0000 UTC m=+0.142014667 container start df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:45:36 compute-2 podman[75669]: 2025-10-10 09:45:36.069990064 +0000 UTC m=+0.146622314 container attach df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:45:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.089+0000 7f4b1c081140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'osd_support'
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'pg_autoscaler'
Oct 10 09:45:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.166+0000 7f4b1c081140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.252+0000 7f4b1c081140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'progress'
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'prometheus'
Oct 10 09:45:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.321+0000 7f4b1c081140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mon[74913]: Health check failed: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct 10 09:45:36 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1727378227' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 09:45:36 compute-2 ceph-mon[74913]: osdmap e18: 2 total, 2 up, 2 in
Oct 10 09:45:36 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:36 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1828731644' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 09:45:36 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e19 e19: 2 total, 2 up, 2 in
Oct 10 09:45:36 compute-2 vibrant_khorana[75686]: --> passed data devices: 0 physical, 1 LVM
Oct 10 09:45:36 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 09:45:36 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 09:45:36 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new fd47bcfa-dab9-466a-b4bb-0169e493040a
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rbd_support'
Oct 10 09:45:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.696+0000 7f4b1c081140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'restful'
Oct 10 09:45:36 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "fd47bcfa-dab9-466a-b4bb-0169e493040a"} v 0)
Oct 10 09:45:36 compute-2 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3277074974' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "fd47bcfa-dab9-466a-b4bb-0169e493040a"}]: dispatch
Oct 10 09:45:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.817+0000 7f4b1c081140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 09:45:36 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e20 e20: 3 total, 2 up, 3 in
Oct 10 09:45:36 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Oct 10 09:45:36 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Oct 10 09:45:36 compute-2 lvm[75747]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 09:45:36 compute-2 lvm[75747]: VG ceph_vg0 finished
Oct 10 09:45:36 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct 10 09:45:36 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:37 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Oct 10 09:45:37 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rgw'
Oct 10 09:45:37 compute-2 ceph-mgr[75218]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 09:45:37 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rook'
Oct 10 09:45:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:37.262+0000 7f4b1c081140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 09:45:37 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1828731644' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 09:45:37 compute-2 ceph-mon[74913]: osdmap e19: 2 total, 2 up, 2 in
Oct 10 09:45:37 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3277074974' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "fd47bcfa-dab9-466a-b4bb-0169e493040a"}]: dispatch
Oct 10 09:45:37 compute-2 ceph-mon[74913]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "fd47bcfa-dab9-466a-b4bb-0169e493040a"}]: dispatch
Oct 10 09:45:37 compute-2 ceph-mon[74913]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "fd47bcfa-dab9-466a-b4bb-0169e493040a"}]': finished
Oct 10 09:45:37 compute-2 ceph-mon[74913]: osdmap e20: 3 total, 2 up, 3 in
Oct 10 09:45:37 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:37 compute-2 ceph-mon[74913]: pgmap v72: 5 pgs: 4 unknown, 1 active+clean; 449 KiB data, 453 MiB used, 40 GiB / 40 GiB avail
Oct 10 09:45:37 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3839621145' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 09:45:37 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0)
Oct 10 09:45:37 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1014583551' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Oct 10 09:45:37 compute-2 vibrant_khorana[75686]:  stderr: got monmap epoch 3
Oct 10 09:45:37 compute-2 vibrant_khorana[75686]: --> Creating keyring file for osd.2
Oct 10 09:45:37 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Oct 10 09:45:37 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Oct 10 09:45:37 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid fd47bcfa-dab9-466a-b4bb-0169e493040a --setuser ceph --setgroup ceph
Oct 10 09:45:37 compute-2 ceph-mgr[75218]: mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 09:45:37 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'selftest'
Oct 10 09:45:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:37.835+0000 7f4b1c081140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 09:45:37 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e21 e21: 3 total, 2 up, 3 in
Oct 10 09:45:37 compute-2 ceph-mgr[75218]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 09:45:37 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'snap_schedule'
Oct 10 09:45:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:37.907+0000 7f4b1c081140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 09:45:37 compute-2 ceph-mgr[75218]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 09:45:37 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'stats'
Oct 10 09:45:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:37.988+0000 7f4b1c081140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'status'
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'telegraf'
Oct 10 09:45:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.139+0000 7f4b1c081140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'telemetry'
Oct 10 09:45:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.211+0000 7f4b1c081140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.358+0000 7f4b1c081140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'test_orchestrator'
Oct 10 09:45:38 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1014583551' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Oct 10 09:45:38 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3839621145' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 09:45:38 compute-2 ceph-mon[74913]: osdmap e21: 3 total, 2 up, 3 in
Oct 10 09:45:38 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:38 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'volumes'
Oct 10 09:45:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.573+0000 7f4b1c081140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'zabbix'
Oct 10 09:45:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.843+0000 7f4b1c081140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e22 e22: 3 total, 2 up, 3 in
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.920+0000 7f4b1c081140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 09:45:38 compute-2 ceph-mgr[75218]: ms_deliver_dispatch: unhandled message 0x55da60f76d00 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Oct 10 09:45:39 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2251912187' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 09:45:39 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Oct 10 09:45:39 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2251912187' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 09:45:39 compute-2 ceph-mon[74913]: osdmap e22: 3 total, 2 up, 3 in
Oct 10 09:45:39 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:39 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 09:45:39 compute-2 ceph-mon[74913]: Standby manager daemon compute-2.gkrssp started
Oct 10 09:45:39 compute-2 ceph-mon[74913]: pgmap v75: 7 pgs: 2 active+clean, 5 unknown; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Oct 10 09:45:39 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 09:45:39 compute-2 ceph-mon[74913]: Standby manager daemon compute-1.rfugxc started
Oct 10 09:45:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e22 _set_new_cache_sizes cache_size:1020053225 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:45:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e23 e23: 3 total, 2 up, 3 in
Oct 10 09:45:40 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1271642618' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Oct 10 09:45:40 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:40 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:40 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Oct 10 09:45:40 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 09:45:40 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1271642618' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Oct 10 09:45:40 compute-2 ceph-mon[74913]: osdmap e23: 3 total, 2 up, 3 in
Oct 10 09:45:40 compute-2 ceph-mon[74913]: mgrmap e10: compute-0.xkdepb(active, since 2m), standbys: compute-2.gkrssp, compute-1.rfugxc
Oct 10 09:45:40 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:40 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr metadata", "who": "compute-2.gkrssp", "id": "compute-2.gkrssp"}]: dispatch
Oct 10 09:45:40 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr metadata", "who": "compute-1.rfugxc", "id": "compute-1.rfugxc"}]: dispatch
Oct 10 09:45:40 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 09:45:40 compute-2 vibrant_khorana[75686]:  stderr: 2025-10-10T09:45:37.570+0000 7f21506e6740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) No valid bdev label found
Oct 10 09:45:40 compute-2 vibrant_khorana[75686]:  stderr: 2025-10-10T09:45:37.837+0000 7f21506e6740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Oct 10 09:45:40 compute-2 vibrant_khorana[75686]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Oct 10 09:45:40 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 09:45:40 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Oct 10 09:45:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e24 e24: 3 total, 2 up, 3 in
Oct 10 09:45:41 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:41 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:41 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct 10 09:45:41 compute-2 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 09:45:41 compute-2 vibrant_khorana[75686]: --> ceph-volume lvm activate successful for osd ID: 2
Oct 10 09:45:41 compute-2 vibrant_khorana[75686]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Oct 10 09:45:41 compute-2 systemd[1]: libpod-df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac.scope: Deactivated successfully.
Oct 10 09:45:41 compute-2 systemd[1]: libpod-df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac.scope: Consumed 1.964s CPU time.
Oct 10 09:45:41 compute-2 podman[75669]: 2025-10-10 09:45:41.250103746 +0000 UTC m=+5.326736086 container died df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:45:41 compute-2 systemd[1]: var-lib-containers-storage-overlay-cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b-merged.mount: Deactivated successfully.
Oct 10 09:45:41 compute-2 podman[75669]: 2025-10-10 09:45:41.306935802 +0000 UTC m=+5.383568062 container remove df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Oct 10 09:45:41 compute-2 systemd[1]: libpod-conmon-df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac.scope: Deactivated successfully.
Oct 10 09:45:41 compute-2 sudo[75561]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:41 compute-2 sudo[76676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:45:41 compute-2 sudo[76676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:41 compute-2 sudo[76676]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:41 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2550341542' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Oct 10 09:45:41 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Oct 10 09:45:41 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2550341542' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Oct 10 09:45:41 compute-2 ceph-mon[74913]: osdmap e24: 3 total, 2 up, 3 in
Oct 10 09:45:41 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:41 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 09:45:41 compute-2 ceph-mon[74913]: 2.1e scrub starts
Oct 10 09:45:41 compute-2 ceph-mon[74913]: pgmap v78: 38 pgs: 6 active+clean, 32 unknown; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Oct 10 09:45:41 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 09:45:41 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 09:45:41 compute-2 ceph-mon[74913]: 2.1e scrub ok
Oct 10 09:45:41 compute-2 sudo[76701]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4 -- lvm list --format json
Oct 10 09:45:41 compute-2 sudo[76701]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:41 compute-2 podman[76768]: 2025-10-10 09:45:41.817122751 +0000 UTC m=+0.038708468 container create b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Oct 10 09:45:41 compute-2 systemd[1]: Started libpod-conmon-b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2.scope.
Oct 10 09:45:41 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:41 compute-2 podman[76768]: 2025-10-10 09:45:41.79895823 +0000 UTC m=+0.020543967 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:41 compute-2 podman[76768]: 2025-10-10 09:45:41.896095973 +0000 UTC m=+0.117681740 container init b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 10 09:45:41 compute-2 podman[76768]: 2025-10-10 09:45:41.901799786 +0000 UTC m=+0.123385503 container start b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:45:41 compute-2 podman[76768]: 2025-10-10 09:45:41.904857764 +0000 UTC m=+0.126443481 container attach b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Oct 10 09:45:41 compute-2 kind_golick[76784]: 167 167
Oct 10 09:45:41 compute-2 systemd[1]: libpod-b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2.scope: Deactivated successfully.
Oct 10 09:45:41 compute-2 podman[76768]: 2025-10-10 09:45:41.9072689 +0000 UTC m=+0.128854647 container died b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:45:41 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e25 e25: 3 total, 2 up, 3 in
Oct 10 09:45:41 compute-2 systemd[1]: var-lib-containers-storage-overlay-f408342537c5e6fdd036ccc0796c0a26f0667216386826a04c197b35d27c0187-merged.mount: Deactivated successfully.
Oct 10 09:45:41 compute-2 podman[76768]: 2025-10-10 09:45:41.948464077 +0000 UTC m=+0.170049794 container remove b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 10 09:45:41 compute-2 systemd[1]: libpod-conmon-b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2.scope: Deactivated successfully.
Oct 10 09:45:42 compute-2 podman[76808]: 2025-10-10 09:45:42.127876409 +0000 UTC m=+0.043013955 container create c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Oct 10 09:45:42 compute-2 systemd[1]: Started libpod-conmon-c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36.scope.
Oct 10 09:45:42 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:42 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb225fa0f9c578bd5c5442ba3963afd765519ae78f045d740cbd09adf79c86e3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:42 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb225fa0f9c578bd5c5442ba3963afd765519ae78f045d740cbd09adf79c86e3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:42 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb225fa0f9c578bd5c5442ba3963afd765519ae78f045d740cbd09adf79c86e3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:42 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb225fa0f9c578bd5c5442ba3963afd765519ae78f045d740cbd09adf79c86e3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:42 compute-2 podman[76808]: 2025-10-10 09:45:42.108093787 +0000 UTC m=+0.023231383 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:42 compute-2 podman[76808]: 2025-10-10 09:45:42.208649719 +0000 UTC m=+0.123787295 container init c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1)
Oct 10 09:45:42 compute-2 podman[76808]: 2025-10-10 09:45:42.222311206 +0000 UTC m=+0.137448772 container start c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Oct 10 09:45:42 compute-2 podman[76808]: 2025-10-10 09:45:42.227378227 +0000 UTC m=+0.142515813 container attach c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:45:42 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1162723757' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Oct 10 09:45:42 compute-2 ceph-mon[74913]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct 10 09:45:42 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Oct 10 09:45:42 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 09:45:42 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 09:45:42 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1162723757' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Oct 10 09:45:42 compute-2 ceph-mon[74913]: osdmap e25: 3 total, 2 up, 3 in
Oct 10 09:45:42 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:42 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 09:45:42 compute-2 ceph-mon[74913]: 2.1d deep-scrub starts
Oct 10 09:45:42 compute-2 ceph-mon[74913]: 2.1d deep-scrub ok
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]: {
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:     "2": [
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:         {
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             "devices": [
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "/dev/loop3"
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             ],
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             "lv_name": "ceph_lv0",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             "lv_size": "21470642176",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=ohvxnl-h5B1-cd0V-szWk-w8oI-A7ra-lPf83P,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=21f084a3-af34-5230-afe4-ea5cd24a55f4,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=fd47bcfa-dab9-466a-b4bb-0169e493040a,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             "lv_uuid": "ohvxnl-h5B1-cd0V-szWk-w8oI-A7ra-lPf83P",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             "name": "ceph_lv0",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             "path": "/dev/ceph_vg0/ceph_lv0",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             "tags": {
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.block_uuid": "ohvxnl-h5B1-cd0V-szWk-w8oI-A7ra-lPf83P",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.cephx_lockbox_secret": "",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.cluster_fsid": "21f084a3-af34-5230-afe4-ea5cd24a55f4",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.cluster_name": "ceph",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.crush_device_class": "",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.encrypted": "0",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.osd_fsid": "fd47bcfa-dab9-466a-b4bb-0169e493040a",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.osd_id": "2",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.osdspec_affinity": "default_drive_group",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.type": "block",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.vdo": "0",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:                 "ceph.with_tpm": "0"
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             },
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             "type": "block",
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:             "vg_name": "ceph_vg0"
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:         }
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]:     ]
Oct 10 09:45:42 compute-2 pedantic_ardinghelli[76825]: }
Oct 10 09:45:42 compute-2 systemd[1]: libpod-c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36.scope: Deactivated successfully.
Oct 10 09:45:42 compute-2 podman[76808]: 2025-10-10 09:45:42.505896136 +0000 UTC m=+0.421033792 container died c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 09:45:42 compute-2 systemd[1]: var-lib-containers-storage-overlay-bb225fa0f9c578bd5c5442ba3963afd765519ae78f045d740cbd09adf79c86e3-merged.mount: Deactivated successfully.
Oct 10 09:45:42 compute-2 podman[76808]: 2025-10-10 09:45:42.559118426 +0000 UTC m=+0.474256032 container remove c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Oct 10 09:45:42 compute-2 systemd[1]: libpod-conmon-c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36.scope: Deactivated successfully.
Oct 10 09:45:42 compute-2 sudo[76701]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:42 compute-2 sudo[76846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:45:42 compute-2 sudo[76846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:42 compute-2 sudo[76846]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:42 compute-2 sudo[76871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:42 compute-2 sudo[76871]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:42 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e26 e26: 3 total, 2 up, 3 in
Oct 10 09:45:43 compute-2 podman[76936]: 2025-10-10 09:45:43.288313872 +0000 UTC m=+0.056638251 container create 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:45:43 compute-2 systemd[1]: Started libpod-conmon-63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0.scope.
Oct 10 09:45:43 compute-2 podman[76936]: 2025-10-10 09:45:43.261491175 +0000 UTC m=+0.029815604 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:43 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:43 compute-2 podman[76936]: 2025-10-10 09:45:43.377678737 +0000 UTC m=+0.146003126 container init 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Oct 10 09:45:43 compute-2 podman[76936]: 2025-10-10 09:45:43.387977785 +0000 UTC m=+0.156302144 container start 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid)
Oct 10 09:45:43 compute-2 podman[76936]: 2025-10-10 09:45:43.391548 +0000 UTC m=+0.159872399 container attach 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Oct 10 09:45:43 compute-2 sleepy_knuth[76952]: 167 167
Oct 10 09:45:43 compute-2 systemd[1]: libpod-63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0.scope: Deactivated successfully.
Oct 10 09:45:43 compute-2 podman[76936]: 2025-10-10 09:45:43.395875948 +0000 UTC m=+0.164200377 container died 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid)
Oct 10 09:45:43 compute-2 systemd[1]: var-lib-containers-storage-overlay-98e49d422ac95e2b5dedd5b14a202537dfd3b9926624b62d90bb03d57dbdbbd5-merged.mount: Deactivated successfully.
Oct 10 09:45:43 compute-2 podman[76936]: 2025-10-10 09:45:43.455583905 +0000 UTC m=+0.223908284 container remove 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Oct 10 09:45:43 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Oct 10 09:45:43 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:43 compute-2 ceph-mon[74913]: Deploying daemon osd.2 on compute-2
Oct 10 09:45:43 compute-2 ceph-mon[74913]: 2.1f scrub starts
Oct 10 09:45:43 compute-2 ceph-mon[74913]: 2.1f scrub ok
Oct 10 09:45:43 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/616535579' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Oct 10 09:45:43 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Oct 10 09:45:43 compute-2 ceph-mon[74913]: osdmap e26: 3 total, 2 up, 3 in
Oct 10 09:45:43 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:43 compute-2 ceph-mon[74913]: pgmap v81: 100 pgs: 38 active+clean, 62 unknown; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Oct 10 09:45:43 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 09:45:43 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 09:45:43 compute-2 systemd[1]: libpod-conmon-63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0.scope: Deactivated successfully.
Oct 10 09:45:43 compute-2 podman[76982]: 2025-10-10 09:45:43.758396039 +0000 UTC m=+0.045607068 container create 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Oct 10 09:45:43 compute-2 systemd[1]: Started libpod-conmon-7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545.scope.
Oct 10 09:45:43 compute-2 podman[76982]: 2025-10-10 09:45:43.739683962 +0000 UTC m=+0.026895021 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:43 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:43 compute-2 podman[76982]: 2025-10-10 09:45:43.878139665 +0000 UTC m=+0.165350744 container init 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Oct 10 09:45:43 compute-2 podman[76982]: 2025-10-10 09:45:43.887162583 +0000 UTC m=+0.174373602 container start 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:45:43 compute-2 podman[76982]: 2025-10-10 09:45:43.890999606 +0000 UTC m=+0.178210685 container attach 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:45:43 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e27 e27: 3 total, 2 up, 3 in
Oct 10 09:45:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test[76999]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Oct 10 09:45:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test[76999]:                             [--no-systemd] [--no-tmpfs]
Oct 10 09:45:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test[76999]: ceph-volume activate: error: unrecognized arguments: --bad-option
Oct 10 09:45:44 compute-2 systemd[1]: libpod-7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545.scope: Deactivated successfully.
Oct 10 09:45:44 compute-2 podman[76982]: 2025-10-10 09:45:44.078078333 +0000 UTC m=+0.365289362 container died 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Oct 10 09:45:44 compute-2 systemd[1]: var-lib-containers-storage-overlay-604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40-merged.mount: Deactivated successfully.
Oct 10 09:45:44 compute-2 podman[76982]: 2025-10-10 09:45:44.125918981 +0000 UTC m=+0.413130050 container remove 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325)
Oct 10 09:45:44 compute-2 systemd[1]: libpod-conmon-7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545.scope: Deactivated successfully.
Oct 10 09:45:44 compute-2 systemd[1]: Reloading.
Oct 10 09:45:44 compute-2 systemd-rc-local-generator[77060]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:45:44 compute-2 systemd-sysv-generator[77063]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:45:44 compute-2 systemd[1]: Reloading.
Oct 10 09:45:44 compute-2 systemd-rc-local-generator[77097]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:45:44 compute-2 systemd-sysv-generator[77100]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:45:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e27 _set_new_cache_sizes cache_size:1020054711 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:45:44 compute-2 systemd[1]: Starting Ceph osd.2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:45:44 compute-2 ceph-mon[74913]: 4.1e scrub starts
Oct 10 09:45:44 compute-2 ceph-mon[74913]: 4.1e scrub ok
Oct 10 09:45:44 compute-2 ceph-mon[74913]: 2.9 scrub starts
Oct 10 09:45:44 compute-2 ceph-mon[74913]: 2.9 scrub ok
Oct 10 09:45:44 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/616535579' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Oct 10 09:45:44 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 09:45:44 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 09:45:44 compute-2 ceph-mon[74913]: osdmap e27: 3 total, 2 up, 3 in
Oct 10 09:45:44 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e28 e28: 3 total, 2 up, 3 in
Oct 10 09:45:45 compute-2 podman[77157]: 2025-10-10 09:45:45.108245704 +0000 UTC m=+0.036616411 container create a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:45:45 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:45 compute-2 podman[77157]: 2025-10-10 09:45:45.174355936 +0000 UTC m=+0.102726683 container init a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:45:45 compute-2 podman[77157]: 2025-10-10 09:45:45.185206082 +0000 UTC m=+0.113576769 container start a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct 10 09:45:45 compute-2 podman[77157]: 2025-10-10 09:45:45.091882071 +0000 UTC m=+0.020252798 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:45 compute-2 podman[77157]: 2025-10-10 09:45:45.18858057 +0000 UTC m=+0.116951357 container attach a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Oct 10 09:45:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 09:45:45 compute-2 bash[77157]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 09:45:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 09:45:45 compute-2 bash[77157]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 09:45:45 compute-2 lvm[77253]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 09:45:45 compute-2 lvm[77253]: VG ceph_vg0 finished
Oct 10 09:45:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: --> Failed to activate via raw: did not find any matching OSD to activate
Oct 10 09:45:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 09:45:46 compute-2 bash[77157]: --> Failed to activate via raw: did not find any matching OSD to activate
Oct 10 09:45:46 compute-2 bash[77157]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 09:45:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 09:45:46 compute-2 bash[77157]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 09:45:46 compute-2 ceph-mon[74913]: 4.1f scrub starts
Oct 10 09:45:46 compute-2 ceph-mon[74913]: 4.1f scrub ok
Oct 10 09:45:46 compute-2 ceph-mon[74913]: 2.1c scrub starts
Oct 10 09:45:46 compute-2 ceph-mon[74913]: 2.1c scrub ok
Oct 10 09:45:46 compute-2 ceph-mon[74913]: pgmap v83: 162 pgs: 2 peering, 98 active+clean, 62 unknown; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Oct 10 09:45:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2263940004' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Oct 10 09:45:46 compute-2 ceph-mon[74913]: osdmap e28: 3 total, 2 up, 3 in
Oct 10 09:45:46 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 09:45:46 compute-2 bash[77157]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 09:45:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Oct 10 09:45:46 compute-2 bash[77157]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Oct 10 09:45:46 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e29 e29: 3 total, 2 up, 3 in
Oct 10 09:45:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:46 compute-2 bash[77157]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:46 compute-2 bash[77157]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct 10 09:45:46 compute-2 bash[77157]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct 10 09:45:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 09:45:46 compute-2 bash[77157]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 09:45:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: --> ceph-volume lvm activate successful for osd ID: 2
Oct 10 09:45:46 compute-2 bash[77157]: --> ceph-volume lvm activate successful for osd ID: 2
Oct 10 09:45:46 compute-2 systemd[1]: libpod-a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e.scope: Deactivated successfully.
Oct 10 09:45:46 compute-2 systemd[1]: libpod-a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e.scope: Consumed 1.395s CPU time.
Oct 10 09:45:46 compute-2 podman[77157]: 2025-10-10 09:45:46.541525464 +0000 UTC m=+1.469896161 container died a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 09:45:46 compute-2 systemd[1]: var-lib-containers-storage-overlay-443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440-merged.mount: Deactivated successfully.
Oct 10 09:45:46 compute-2 podman[77157]: 2025-10-10 09:45:46.588356612 +0000 UTC m=+1.516727299 container remove a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:45:46 compute-2 podman[77404]: 2025-10-10 09:45:46.784332284 +0000 UTC m=+0.037232320 container create 0aa08009f7f58cec73f1dc2a942b136c595874bb56d58c9180505887d3182275 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Oct 10 09:45:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96309a322ca16c3212575cb90ed0e5f7f34baee3f898bb9fff6636016c1612c0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96309a322ca16c3212575cb90ed0e5f7f34baee3f898bb9fff6636016c1612c0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96309a322ca16c3212575cb90ed0e5f7f34baee3f898bb9fff6636016c1612c0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96309a322ca16c3212575cb90ed0e5f7f34baee3f898bb9fff6636016c1612c0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:46 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96309a322ca16c3212575cb90ed0e5f7f34baee3f898bb9fff6636016c1612c0/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:46 compute-2 podman[77404]: 2025-10-10 09:45:46.837650998 +0000 UTC m=+0.090551044 container init 0aa08009f7f58cec73f1dc2a942b136c595874bb56d58c9180505887d3182275 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Oct 10 09:45:46 compute-2 podman[77404]: 2025-10-10 09:45:46.844582439 +0000 UTC m=+0.097482475 container start 0aa08009f7f58cec73f1dc2a942b136c595874bb56d58c9180505887d3182275 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Oct 10 09:45:46 compute-2 bash[77404]: 0aa08009f7f58cec73f1dc2a942b136c595874bb56d58c9180505887d3182275
Oct 10 09:45:46 compute-2 podman[77404]: 2025-10-10 09:45:46.767854266 +0000 UTC m=+0.020754332 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:46 compute-2 systemd[1]: Started Ceph osd.2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:45:46 compute-2 ceph-osd[77423]: set uid:gid to 167:167 (ceph:ceph)
Oct 10 09:45:46 compute-2 ceph-osd[77423]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Oct 10 09:45:46 compute-2 ceph-osd[77423]: pidfile_write: ignore empty --pid-file
Oct 10 09:45:46 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:46 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:46 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:46 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:46 compute-2 sudo[76871]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:46 compute-2 sudo[77435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:45:46 compute-2 sudo[77435]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:46 compute-2 sudo[77435]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:47 compute-2 ceph-mon[74913]: 3.18 scrub starts
Oct 10 09:45:47 compute-2 ceph-mon[74913]: 3.18 scrub ok
Oct 10 09:45:47 compute-2 ceph-mon[74913]: 2.8 scrub starts
Oct 10 09:45:47 compute-2 ceph-mon[74913]: 2.8 scrub ok
Oct 10 09:45:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2263940004' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Oct 10 09:45:47 compute-2 ceph-mon[74913]: osdmap e29: 3 total, 2 up, 3 in
Oct 10 09:45:47 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:47 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:47 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:47 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:47 compute-2 sudo[77460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4 -- raw list --format json
Oct 10 09:45:47 compute-2 sudo[77460]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:47 compute-2 podman[77533]: 2025-10-10 09:45:47.40952556 +0000 UTC m=+0.040138408 container create f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:45:47 compute-2 systemd[1]: Started libpod-conmon-f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f.scope.
Oct 10 09:45:47 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:47 compute-2 podman[77533]: 2025-10-10 09:45:47.389765392 +0000 UTC m=+0.020378260 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:47 compute-2 podman[77533]: 2025-10-10 09:45:47.494319902 +0000 UTC m=+0.124932740 container init f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 09:45:47 compute-2 podman[77533]: 2025-10-10 09:45:47.500174777 +0000 UTC m=+0.130787605 container start f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147b800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147b800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:47 compute-2 podman[77533]: 2025-10-10 09:45:47.505063639 +0000 UTC m=+0.135676487 container attach f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Oct 10 09:45:47 compute-2 romantic_galileo[77549]: 167 167
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147b800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Oct 10 09:45:47 compute-2 systemd[1]: libpod-f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f.scope: Deactivated successfully.
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147b800 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:47 compute-2 podman[77533]: 2025-10-10 09:45:47.505968769 +0000 UTC m=+0.136581617 container died f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 09:45:47 compute-2 systemd[1]: var-lib-containers-storage-overlay-0e1ef57bd601e4c5f84541d5e40ced1ef613c37c332d977c2a646394b06cea23-merged.mount: Deactivated successfully.
Oct 10 09:45:47 compute-2 podman[77533]: 2025-10-10 09:45:47.553089127 +0000 UTC m=+0.183701945 container remove f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Oct 10 09:45:47 compute-2 systemd[1]: libpod-conmon-f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f.scope: Deactivated successfully.
Oct 10 09:45:47 compute-2 podman[77576]: 2025-10-10 09:45:47.728900548 +0000 UTC m=+0.053723389 container create 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Oct 10 09:45:47 compute-2 systemd[1]: Started libpod-conmon-043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08.scope.
Oct 10 09:45:47 compute-2 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:47 compute-2 podman[77576]: 2025-10-10 09:45:47.697296656 +0000 UTC m=+0.022119587 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:47 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:47 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4f6e9f6126da2100b98796ec2e0372b5256720ab4fe80f6f3f31bffe791e689/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:47 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4f6e9f6126da2100b98796ec2e0372b5256720ab4fe80f6f3f31bffe791e689/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:47 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4f6e9f6126da2100b98796ec2e0372b5256720ab4fe80f6f3f31bffe791e689/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:47 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4f6e9f6126da2100b98796ec2e0372b5256720ab4fe80f6f3f31bffe791e689/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:47 compute-2 podman[77576]: 2025-10-10 09:45:47.813760422 +0000 UTC m=+0.138583283 container init 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Oct 10 09:45:47 compute-2 podman[77576]: 2025-10-10 09:45:47.829024089 +0000 UTC m=+0.153846930 container start 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:45:47 compute-2 podman[77576]: 2025-10-10 09:45:47.833472008 +0000 UTC m=+0.158294869 container attach 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True)
Oct 10 09:45:48 compute-2 ceph-osd[77423]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Oct 10 09:45:48 compute-2 ceph-mon[74913]: 3.17 scrub starts
Oct 10 09:45:48 compute-2 ceph-mon[74913]: 3.17 scrub ok
Oct 10 09:45:48 compute-2 ceph-mon[74913]: 2.7 scrub starts
Oct 10 09:45:48 compute-2 ceph-osd[77423]: load: jerasure load: lrc 
Oct 10 09:45:48 compute-2 ceph-mon[74913]: 2.7 scrub ok
Oct 10 09:45:48 compute-2 ceph-mon[74913]: pgmap v86: 162 pgs: 2 peering, 98 active+clean, 62 unknown; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Oct 10 09:45:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2169807361' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Oct 10 09:45:48 compute-2 ceph-mon[74913]: Health check update: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e30 e30: 3 total, 2 up, 3 in
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:48 compute-2 lvm[77676]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 09:45:48 compute-2 lvm[77676]: VG ceph_vg0 finished
Oct 10 09:45:48 compute-2 ceph-osd[77423]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Oct 10 09:45:48 compute-2 strange_solomon[77592]: {}
Oct 10 09:45:48 compute-2 ceph-osd[77423]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:48 compute-2 systemd[1]: libpod-043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08.scope: Deactivated successfully.
Oct 10 09:45:48 compute-2 podman[77576]: 2025-10-10 09:45:48.630039186 +0000 UTC m=+0.954862017 container died 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:45:48 compute-2 systemd[1]: libpod-043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08.scope: Consumed 1.241s CPU time.
Oct 10 09:45:48 compute-2 systemd[1]: var-lib-containers-storage-overlay-e4f6e9f6126da2100b98796ec2e0372b5256720ab4fe80f6f3f31bffe791e689-merged.mount: Deactivated successfully.
Oct 10 09:45:48 compute-2 podman[77576]: 2025-10-10 09:45:48.680997392 +0000 UTC m=+1.005820223 container remove 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:45:48 compute-2 systemd[1]: libpod-conmon-043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08.scope: Deactivated successfully.
Oct 10 09:45:48 compute-2 sudo[77460]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:48 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:48 compute-2 sudo[77704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:45:48 compute-2 sudo[77704]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:48 compute-2 sudo[77704]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:49 compute-2 sudo[77729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:45:49 compute-2 sudo[77729]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:49 compute-2 ceph-mon[74913]: 4.10 scrub starts
Oct 10 09:45:49 compute-2 ceph-mon[74913]: 4.10 scrub ok
Oct 10 09:45:49 compute-2 ceph-mon[74913]: 2.a scrub starts
Oct 10 09:45:49 compute-2 ceph-mon[74913]: 2.a scrub ok
Oct 10 09:45:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2169807361' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Oct 10 09:45:49 compute-2 ceph-mon[74913]: osdmap e30: 3 total, 2 up, 3 in
Oct 10 09:45:49 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:49 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:49 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:49 compute-2 sudo[77729]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:49 compute-2 sudo[77754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Oct 10 09:45:49 compute-2 sudo[77754]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount shared_bdev_used = 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: RocksDB version: 7.9.2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Git sha 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Compile date 2025-07-17 03:12:14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: DB SUMMARY
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: DB Session ID:  E13QT5JWED64DXY9YRGI
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: CURRENT file:  CURRENT
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: IDENTITY file:  IDENTITY
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                         Options.error_if_exists: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.create_if_missing: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                         Options.paranoid_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.flush_verify_memtable_count: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                                     Options.env: 0x55cb214cf650
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                                      Options.fs: LegacyFileSystem
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                                Options.info_log: 0x55cb222ff6e0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_file_opening_threads: 16
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                              Options.statistics: (nil)
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.use_fsync: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.max_log_file_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.log_file_time_to_roll: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.keep_log_file_num: 1000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.recycle_log_file_num: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                         Options.allow_fallocate: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.allow_mmap_reads: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.allow_mmap_writes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.use_direct_reads: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.create_missing_column_families: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                              Options.db_log_dir: 
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                                 Options.wal_dir: db.wal
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.table_cache_numshardbits: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                         Options.WAL_ttl_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.WAL_size_limit_MB: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.manifest_preallocation_size: 4194304
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                     Options.is_fd_close_on_exec: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.advise_random_on_open: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.db_write_buffer_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.write_buffer_manager: 0x55cb223f2a00
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.access_hint_on_compaction_start: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                      Options.use_adaptive_mutex: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                            Options.rate_limiter: (nil)
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.wal_recovery_mode: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.enable_thread_tracking: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.enable_pipelined_write: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.unordered_write: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.write_thread_max_yield_usec: 100
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.row_cache: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                              Options.wal_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.avoid_flush_during_recovery: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.allow_ingest_behind: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.two_write_queues: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.manual_wal_flush: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.wal_compression: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.atomic_flush: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.persist_stats_to_disk: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.write_dbid_to_manifest: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.log_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.best_efforts_recovery: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.allow_data_in_errors: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.db_host_id: __hostname__
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.enforce_single_del_contracts: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.max_background_jobs: 4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.max_background_compactions: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.max_subcompactions: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.writable_file_max_buffer_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.delayed_write_rate : 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.max_total_wal_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.stats_dump_period_sec: 600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.stats_persist_period_sec: 600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.max_open_files: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.bytes_per_sync: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                      Options.wal_bytes_per_sync: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.strict_bytes_per_sync: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.compaction_readahead_size: 2097152
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.max_background_flushes: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Compression algorithms supported:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kZSTD supported: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kXpressCompression supported: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kBZip2Compression supported: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kZSTDNotFinalCompression supported: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kLZ4Compression supported: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kZlibCompression supported: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kLZ4HCCompression supported: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kSnappyCompression supported: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Fast CRC32 supported: Supported on x86
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: DMutex implementation: pthread_mutex_t
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb215109b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb215109b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb215109b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d2bae292-d520-4d17-8daf-8d5d2d3cbf01
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549452625, "job": 1, "event": "recovery_started", "wal_files": [31]}
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549452919, "job": 1, "event": "recovery_finished"}
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: freelist init
Oct 10 09:45:49 compute-2 ceph-osd[77423]: freelist _read_cfg
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs umount
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 09:45:49 compute-2 podman[78043]: 2025-10-10 09:45:49.701114719 +0000 UTC m=+0.055028282 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluefs mount shared_bdev_used = 4718592
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: RocksDB version: 7.9.2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Git sha 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Compile date 2025-07-17 03:12:14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: DB SUMMARY
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: DB Session ID:  E13QT5JWED64DXY9YRGJ
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: CURRENT file:  CURRENT
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: IDENTITY file:  IDENTITY
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                         Options.error_if_exists: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.create_if_missing: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                         Options.paranoid_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.flush_verify_memtable_count: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                                     Options.env: 0x55cb214cf110
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                                      Options.fs: LegacyFileSystem
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                                Options.info_log: 0x55cb222ff860
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_file_opening_threads: 16
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                              Options.statistics: (nil)
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.use_fsync: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.max_log_file_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.log_file_time_to_roll: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.keep_log_file_num: 1000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.recycle_log_file_num: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                         Options.allow_fallocate: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.allow_mmap_reads: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.allow_mmap_writes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.use_direct_reads: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.create_missing_column_families: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                              Options.db_log_dir: 
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                                 Options.wal_dir: db.wal
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.table_cache_numshardbits: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                         Options.WAL_ttl_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.WAL_size_limit_MB: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.manifest_preallocation_size: 4194304
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                     Options.is_fd_close_on_exec: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.advise_random_on_open: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.db_write_buffer_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.write_buffer_manager: 0x55cb223f2a00
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.access_hint_on_compaction_start: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                      Options.use_adaptive_mutex: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                            Options.rate_limiter: (nil)
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.wal_recovery_mode: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.enable_thread_tracking: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.enable_pipelined_write: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.unordered_write: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.write_thread_max_yield_usec: 100
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.row_cache: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                              Options.wal_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.avoid_flush_during_recovery: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.allow_ingest_behind: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.two_write_queues: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.manual_wal_flush: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.wal_compression: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.atomic_flush: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.persist_stats_to_disk: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.write_dbid_to_manifest: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.log_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.best_efforts_recovery: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.allow_data_in_errors: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.db_host_id: __hostname__
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.enforce_single_del_contracts: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.max_background_jobs: 4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.max_background_compactions: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.max_subcompactions: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.writable_file_max_buffer_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.delayed_write_rate : 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.max_total_wal_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.stats_dump_period_sec: 600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.stats_persist_period_sec: 600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.max_open_files: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.bytes_per_sync: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                      Options.wal_bytes_per_sync: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.strict_bytes_per_sync: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.compaction_readahead_size: 2097152
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.max_background_flushes: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Compression algorithms supported:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kZSTD supported: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kXpressCompression supported: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kBZip2Compression supported: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kZSTDNotFinalCompression supported: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kLZ4Compression supported: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kZlibCompression supported: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kLZ4HCCompression supported: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         kSnappyCompression supported: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Fast CRC32 supported: Supported on x86
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: DMutex implementation: pthread_mutex_t
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb21511350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffa00)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb215109b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffa00)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb215109b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffa00)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cb215109b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d2bae292-d520-4d17-8daf-8d5d2d3cbf01
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549720041, "job": 1, "event": "recovery_started", "wal_files": [31]}
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549724435, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089549, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d2bae292-d520-4d17-8daf-8d5d2d3cbf01", "db_session_id": "E13QT5JWED64DXY9YRGJ", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549727147, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089549, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d2bae292-d520-4d17-8daf-8d5d2d3cbf01", "db_session_id": "E13QT5JWED64DXY9YRGJ", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549730214, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089549, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d2bae292-d520-4d17-8daf-8d5d2d3cbf01", "db_session_id": "E13QT5JWED64DXY9YRGJ", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549731905, "job": 1, "event": "recovery_finished"}
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55cb224a2000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: DB pointer 0x55cb2263e000
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Oct 10 09:45:49 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 09:45:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 4e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Oct 10 09:45:49 compute-2 ceph-osd[77423]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Oct 10 09:45:49 compute-2 ceph-osd[77423]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Oct 10 09:45:49 compute-2 ceph-osd[77423]: _get_class not permitted to load lua
Oct 10 09:45:49 compute-2 ceph-osd[77423]: _get_class not permitted to load sdk
Oct 10 09:45:49 compute-2 ceph-osd[77423]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Oct 10 09:45:49 compute-2 ceph-osd[77423]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Oct 10 09:45:49 compute-2 ceph-osd[77423]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Oct 10 09:45:49 compute-2 ceph-osd[77423]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Oct 10 09:45:49 compute-2 ceph-osd[77423]: osd.2 0 load_pgs
Oct 10 09:45:49 compute-2 ceph-osd[77423]: osd.2 0 load_pgs opened 0 pgs
Oct 10 09:45:49 compute-2 ceph-osd[77423]: osd.2 0 log_to_monitors true
Oct 10 09:45:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2[77419]: 2025-10-10T09:45:49.760+0000 7fcffd2e7740 -1 osd.2 0 log_to_monitors true
Oct 10 09:45:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0)
Oct 10 09:45:49 compute-2 ceph-mon[74913]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/269809354,v1:192.168.122.102:6801/269809354]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Oct 10 09:45:49 compute-2 podman[78043]: 2025-10-10 09:45:49.830702462 +0000 UTC m=+0.184616005 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:45:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:45:50 compute-2 sudo[77754]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:50 compute-2 ceph-mon[74913]: 3.16 deep-scrub starts
Oct 10 09:45:50 compute-2 ceph-mon[74913]: 3.16 deep-scrub ok
Oct 10 09:45:50 compute-2 ceph-mon[74913]: 2.4 scrub starts
Oct 10 09:45:50 compute-2 ceph-mon[74913]: 2.4 scrub ok
Oct 10 09:45:50 compute-2 ceph-mon[74913]: pgmap v88: 162 pgs: 2 peering, 98 active+clean, 62 unknown; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Oct 10 09:45:50 compute-2 ceph-mon[74913]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Oct 10 09:45:50 compute-2 ceph-mon[74913]: Cluster is now healthy
Oct 10 09:45:50 compute-2 ceph-mon[74913]: from='osd.2 [v2:192.168.122.102:6800/269809354,v1:192.168.122.102:6801/269809354]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Oct 10 09:45:50 compute-2 ceph-mon[74913]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Oct 10 09:45:50 compute-2 ceph-mon[74913]: 2.1 scrub starts
Oct 10 09:45:50 compute-2 ceph-mon[74913]: 2.1 scrub ok
Oct 10 09:45:50 compute-2 sudo[78345]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:45:50 compute-2 sudo[78345]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:50 compute-2 sudo[78345]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:50 compute-2 sudo[78370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4 -- inventory --format=json-pretty --filter-for-batch
Oct 10 09:45:50 compute-2 sudo[78370]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:50 compute-2 podman[78434]: 2025-10-10 09:45:50.683094959 +0000 UTC m=+0.046854071 container create b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Oct 10 09:45:50 compute-2 systemd[1]: Started libpod-conmon-b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062.scope.
Oct 10 09:45:50 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:50 compute-2 podman[78434]: 2025-10-10 09:45:50.662544094 +0000 UTC m=+0.026303226 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:50 compute-2 podman[78434]: 2025-10-10 09:45:50.763593447 +0000 UTC m=+0.127352569 container init b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:45:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e31 e31: 3 total, 2 up, 3 in
Oct 10 09:45:50 compute-2 podman[78434]: 2025-10-10 09:45:50.775752102 +0000 UTC m=+0.139511214 container start b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:45:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]} v 0)
Oct 10 09:45:50 compute-2 ceph-mon[74913]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/269809354,v1:192.168.122.102:6801/269809354]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Oct 10 09:45:50 compute-2 quizzical_herschel[78450]: 167 167
Oct 10 09:45:50 compute-2 podman[78434]: 2025-10-10 09:45:50.780414887 +0000 UTC m=+0.144174019 container attach b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Oct 10 09:45:50 compute-2 podman[78434]: 2025-10-10 09:45:50.781013627 +0000 UTC m=+0.144772739 container died b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Oct 10 09:45:50 compute-2 systemd[1]: libpod-b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062.scope: Deactivated successfully.
Oct 10 09:45:50 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Oct 10 09:45:50 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Oct 10 09:45:50 compute-2 systemd[1]: var-lib-containers-storage-overlay-d247c8f1c8b7a8af4d5813292218eac9436c073d5e9124d9019b6df28f87c77d-merged.mount: Deactivated successfully.
Oct 10 09:45:50 compute-2 podman[78434]: 2025-10-10 09:45:50.820769639 +0000 UTC m=+0.184528761 container remove b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Oct 10 09:45:50 compute-2 systemd[1]: libpod-conmon-b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062.scope: Deactivated successfully.
Oct 10 09:45:50 compute-2 podman[78475]: 2025-10-10 09:45:50.976350718 +0000 UTC m=+0.045805816 container create 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:45:51 compute-2 systemd[1]: Started libpod-conmon-89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6.scope.
Oct 10 09:45:51 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:45:51 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3d902b2d5cbf721b701cf7bbd33d7e3830435f6319525d5e354f38ee4c5ced/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:51 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3d902b2d5cbf721b701cf7bbd33d7e3830435f6319525d5e354f38ee4c5ced/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:51 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3d902b2d5cbf721b701cf7bbd33d7e3830435f6319525d5e354f38ee4c5ced/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:51 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3d902b2d5cbf721b701cf7bbd33d7e3830435f6319525d5e354f38ee4c5ced/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 09:45:51 compute-2 podman[78475]: 2025-10-10 09:45:50.957164659 +0000 UTC m=+0.026619797 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:45:51 compute-2 podman[78475]: 2025-10-10 09:45:51.057774747 +0000 UTC m=+0.127229855 container init 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 09:45:51 compute-2 podman[78475]: 2025-10-10 09:45:51.063789658 +0000 UTC m=+0.133244746 container start 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:45:51 compute-2 podman[78475]: 2025-10-10 09:45:51.067440929 +0000 UTC m=+0.136896067 container attach 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:45:51 compute-2 ceph-mon[74913]: 4.11 deep-scrub starts
Oct 10 09:45:51 compute-2 ceph-mon[74913]: 4.11 deep-scrub ok
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Oct 10 09:45:51 compute-2 ceph-mon[74913]: osdmap e31: 3 total, 2 up, 3 in
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='osd.2 [v2:192.168.122.102:6800/269809354,v1:192.168.122.102:6801/269809354]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Oct 10 09:45:51 compute-2 ceph-mon[74913]: 2.0 scrub starts
Oct 10 09:45:51 compute-2 ceph-mon[74913]: 2.0 scrub ok
Oct 10 09:45:51 compute-2 ceph-mon[74913]: pgmap v90: 162 pgs: 162 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 09:45:51 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]: [
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:     {
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:         "available": false,
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:         "being_replaced": false,
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:         "ceph_device_lvm": false,
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:         "device_id": "QEMU_DVD-ROM_QM00001",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:         "lsm_data": {},
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:         "lvs": [],
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:         "path": "/dev/sr0",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:         "rejected_reasons": [
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "Insufficient space (<5GB)",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "Has a FileSystem"
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:         ],
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:         "sys_api": {
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "actuators": null,
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "device_nodes": [
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:                 "sr0"
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             ],
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "devname": "sr0",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "human_readable_size": "482.00 KB",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "id_bus": "ata",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "model": "QEMU DVD-ROM",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "nr_requests": "2",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "parent": "/dev/sr0",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "partitions": {},
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "path": "/dev/sr0",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "removable": "1",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "rev": "2.5+",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "ro": "0",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "rotational": "0",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "sas_address": "",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "sas_device_handle": "",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "scheduler_mode": "mq-deadline",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "sectors": 0,
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "sectorsize": "2048",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "size": 493568.0,
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "support_discard": "2048",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "type": "disk",
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:             "vendor": "QEMU"
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:         }
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]:     }
Oct 10 09:45:51 compute-2 zealous_chatterjee[78491]: ]
Oct 10 09:45:51 compute-2 systemd[1]: libpod-89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6.scope: Deactivated successfully.
Oct 10 09:45:51 compute-2 podman[78475]: 2025-10-10 09:45:51.725544179 +0000 UTC m=+0.794999267 container died 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Oct 10 09:45:51 compute-2 systemd[1]: var-lib-containers-storage-overlay-ca3d902b2d5cbf721b701cf7bbd33d7e3830435f6319525d5e354f38ee4c5ced-merged.mount: Deactivated successfully.
Oct 10 09:45:51 compute-2 podman[78475]: 2025-10-10 09:45:51.772534864 +0000 UTC m=+0.841989982 container remove 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:45:51 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e32 e32: 3 total, 2 up, 3 in
Oct 10 09:45:51 compute-2 ceph-osd[77423]: osd.2 0 done with init, starting boot process
Oct 10 09:45:51 compute-2 ceph-osd[77423]: osd.2 0 start_boot
Oct 10 09:45:51 compute-2 ceph-osd[77423]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Oct 10 09:45:51 compute-2 ceph-osd[77423]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Oct 10 09:45:51 compute-2 ceph-osd[77423]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Oct 10 09:45:51 compute-2 ceph-osd[77423]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Oct 10 09:45:51 compute-2 ceph-osd[77423]: osd.2 0  bench count 12288000 bsize 4 KiB
Oct 10 09:45:51 compute-2 systemd[1]: libpod-conmon-89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6.scope: Deactivated successfully.
Oct 10 09:45:51 compute-2 sudo[78370]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:51 compute-2 sudo[79665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Oct 10 09:45:51 compute-2 sudo[79665]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:51 compute-2 sudo[79665]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[79691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph
Oct 10 09:45:52 compute-2 sudo[79691]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79691]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[79716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:45:52 compute-2 sudo[79716]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79716]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 ceph-mon[74913]: 3.15 deep-scrub starts
Oct 10 09:45:52 compute-2 ceph-mon[74913]: 3.15 deep-scrub ok
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2122384607' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2122384607' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 09:45:52 compute-2 ceph-mon[74913]: osdmap e32: 3 total, 2 up, 3 in
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:52 compute-2 ceph-mon[74913]: 2.2 deep-scrub starts
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Oct 10 09:45:52 compute-2 ceph-mon[74913]: Adjusting osd_memory_target on compute-2 to 128.0M
Oct 10 09:45:52 compute-2 ceph-mon[74913]: Unable to set osd_memory_target on compute-2 to 134243532: error parsing value: Value '134243532' is below minimum 939524096
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:52 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:45:52 compute-2 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.conf
Oct 10 09:45:52 compute-2 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.conf
Oct 10 09:45:52 compute-2 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.conf
Oct 10 09:45:52 compute-2 ceph-mon[74913]: 2.2 deep-scrub ok
Oct 10 09:45:52 compute-2 sudo[79741]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:52 compute-2 sudo[79741]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79741]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[79766]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:45:52 compute-2 sudo[79766]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79766]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[79814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:45:52 compute-2 sudo[79814]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79814]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[79839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:45:52 compute-2 sudo[79839]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79839]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[79864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Oct 10 09:45:52 compute-2 sudo[79864]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79864]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[79889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:45:52 compute-2 sudo[79889]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79889]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[79914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:45:52 compute-2 sudo[79914]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79914]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[79939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:45:52 compute-2 sudo[79939]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79939]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[79964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:45:52 compute-2 sudo[79964]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79964]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[79989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:45:52 compute-2 sudo[79989]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[79989]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e33 e33: 3 total, 2 up, 3 in
Oct 10 09:45:52 compute-2 sudo[80037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:45:52 compute-2 sudo[80037]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[80037]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:52 compute-2 sudo[80062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:45:52 compute-2 sudo[80062]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:52 compute-2 sudo[80062]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:53 compute-2 sudo[80087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:45:53 compute-2 sudo[80087]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:53 compute-2 sudo[80087]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:53 compute-2 ceph-mon[74913]: purged_snaps scrub starts
Oct 10 09:45:53 compute-2 ceph-mon[74913]: purged_snaps scrub ok
Oct 10 09:45:53 compute-2 ceph-mon[74913]: 4.12 scrub starts
Oct 10 09:45:53 compute-2 ceph-mon[74913]: 4.12 scrub ok
Oct 10 09:45:53 compute-2 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:45:53 compute-2 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:45:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2975567301' entity='client.admin' 
Oct 10 09:45:53 compute-2 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:45:53 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:53 compute-2 ceph-mon[74913]: osdmap e33: 3 total, 2 up, 3 in
Oct 10 09:45:53 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:53 compute-2 ceph-mon[74913]: 2.3 deep-scrub starts
Oct 10 09:45:53 compute-2 ceph-mon[74913]: 2.3 deep-scrub ok
Oct 10 09:45:53 compute-2 ceph-mon[74913]: pgmap v93: 162 pgs: 44 peering, 118 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Oct 10 09:45:53 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:53 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:54 compute-2 ceph-mon[74913]: 3.1f scrub starts
Oct 10 09:45:54 compute-2 ceph-mon[74913]: 3.1f scrub ok
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='client.14292 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 09:45:54 compute-2 ceph-mon[74913]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:54 compute-2 ceph-mon[74913]: Saving service ingress.rgw.default spec with placement count:2
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:54 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:54 compute-2 ceph-mon[74913]: 2.11 scrub starts
Oct 10 09:45:54 compute-2 ceph-mon[74913]: 2.11 scrub ok
Oct 10 09:45:54 compute-2 ceph-osd[77423]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 35.622 iops: 9119.334 elapsed_sec: 0.329
Oct 10 09:45:54 compute-2 ceph-osd[77423]: log_channel(cluster) log [WRN] : OSD bench result of 9119.333889 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct 10 09:45:54 compute-2 ceph-osd[77423]: osd.2 0 waiting for initial osdmap
Oct 10 09:45:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2[77419]: 2025-10-10T09:45:54.216+0000 7fcff926a640 -1 osd.2 0 waiting for initial osdmap
Oct 10 09:45:54 compute-2 ceph-osd[77423]: osd.2 33 crush map has features 288514051259236352, adjusting msgr requires for clients
Oct 10 09:45:54 compute-2 ceph-osd[77423]: osd.2 33 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Oct 10 09:45:54 compute-2 ceph-osd[77423]: osd.2 33 crush map has features 3314933000852226048, adjusting msgr requires for osds
Oct 10 09:45:54 compute-2 ceph-osd[77423]: osd.2 33 check_osdmap_features require_osd_release unknown -> squid
Oct 10 09:45:54 compute-2 ceph-osd[77423]: osd.2 33 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Oct 10 09:45:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2[77419]: 2025-10-10T09:45:54.241+0000 7fcff4892640 -1 osd.2 33 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Oct 10 09:45:54 compute-2 ceph-osd[77423]: osd.2 33 set_numa_affinity not setting numa affinity
Oct 10 09:45:54 compute-2 ceph-osd[77423]: osd.2 33 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Oct 10 09:45:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 33 tick checking mon for new map
Oct 10 09:45:55 compute-2 ceph-mon[74913]: 5.19 scrub starts
Oct 10 09:45:55 compute-2 ceph-mon[74913]: 5.19 scrub ok
Oct 10 09:45:55 compute-2 ceph-mon[74913]: OSD bench result of 9119.333889 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct 10 09:45:55 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:55 compute-2 ceph-mon[74913]: 2.14 scrub starts
Oct 10 09:45:55 compute-2 ceph-mon[74913]: 2.14 scrub ok
Oct 10 09:45:55 compute-2 ceph-mon[74913]: pgmap v94: 162 pgs: 44 peering, 118 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Oct 10 09:45:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e34 e34: 3 total, 3 up, 3 in
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 34 state: booting -> active
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.19( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.1a( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.1b( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.1c( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.1a( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.1e( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.9( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.e( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.1d( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.8( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.1d( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.3( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.1( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.6( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.4( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.2( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.1( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.0( empty local-lis/les=0/0 n=0 ec=17/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.d( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.b( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.e( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.9( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.8( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.11( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.8( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.15( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.14( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.17( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.12( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.12( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.15( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.13( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.1f( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.1c( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.1b( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:56 compute-2 ceph-mon[74913]: 3.1e scrub starts
Oct 10 09:45:56 compute-2 ceph-mon[74913]: 3.1e scrub ok
Oct 10 09:45:56 compute-2 ceph-mon[74913]: osd.2 [v2:192.168.122.102:6800/269809354,v1:192.168.122.102:6801/269809354] boot
Oct 10 09:45:56 compute-2 ceph-mon[74913]: osdmap e34: 3 total, 3 up, 3 in
Oct 10 09:45:56 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:45:56 compute-2 ceph-mon[74913]: from='client.14298 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 09:45:56 compute-2 ceph-mon[74913]: Saving service node-exporter spec with placement *
Oct 10 09:45:56 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:56 compute-2 ceph-mon[74913]: Saving service grafana spec with placement compute-0;count:1
Oct 10 09:45:56 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:56 compute-2 ceph-mon[74913]: Saving service prometheus spec with placement compute-0;count:1
Oct 10 09:45:56 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:56 compute-2 ceph-mon[74913]: Saving service alertmanager spec with placement compute-0;count:1
Oct 10 09:45:56 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:56 compute-2 ceph-mon[74913]: 2.16 scrub starts
Oct 10 09:45:56 compute-2 ceph-mon[74913]: 2.16 scrub ok
Oct 10 09:45:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e35 e35: 3 total, 3 up, 3 in
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.17( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.8( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.14( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.e( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.11( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.b( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.d( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.8( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.9( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.0( empty local-lis/les=34/35 n=0 ec=17/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.2( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.0( empty local-lis/les=34/35 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.19( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.1b( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.1( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.1a( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.1b( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.6( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.1e( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.e( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.8( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.1c( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.1d( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.1a( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.15( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.1d( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.3( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.9( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.4( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.12( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.12( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.15( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.1c( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.1f( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.13( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.1( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:56 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Oct 10 09:45:56 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Oct 10 09:45:57 compute-2 ceph-mon[74913]: 6.18 scrub starts
Oct 10 09:45:57 compute-2 ceph-mon[74913]: 6.18 scrub ok
Oct 10 09:45:57 compute-2 ceph-mon[74913]: osdmap e35: 3 total, 3 up, 3 in
Oct 10 09:45:57 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:57 compute-2 ceph-mon[74913]: 2.17 deep-scrub starts
Oct 10 09:45:57 compute-2 ceph-mon[74913]: 2.17 deep-scrub ok
Oct 10 09:45:57 compute-2 ceph-mon[74913]: pgmap v97: 162 pgs: 80 peering, 82 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:45:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2898111592' entity='client.admin' 
Oct 10 09:45:57 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Oct 10 09:45:57 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.15( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.13( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.18( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.12( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.f( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.10( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.d( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.c( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.b( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.5( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.a( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1d( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1c( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1b( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.13( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.18( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.12( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.15( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.10( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.f( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.c( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.5( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.d( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.b( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1c( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1d( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1b( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.a( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:45:58 compute-2 sudo[80113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:45:58 compute-2 sudo[80113]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:45:58 compute-2 ceph-mon[74913]: 5.1d scrub starts
Oct 10 09:45:58 compute-2 ceph-mon[74913]: 5.1d scrub ok
Oct 10 09:45:58 compute-2 ceph-mon[74913]: 6.17 scrub starts
Oct 10 09:45:58 compute-2 ceph-mon[74913]: 6.17 scrub ok
Oct 10 09:45:58 compute-2 ceph-mon[74913]: 2.1a scrub starts
Oct 10 09:45:58 compute-2 sudo[80113]: pam_unix(sudo:session): session closed for user root
Oct 10 09:45:58 compute-2 ceph-mon[74913]: 2.1a scrub ok
Oct 10 09:45:58 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:58 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1237849469' entity='client.admin' 
Oct 10 09:45:58 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.e scrub starts
Oct 10 09:45:58 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.e scrub ok
Oct 10 09:45:59 compute-2 ceph-mon[74913]: 6.1f scrub starts
Oct 10 09:45:59 compute-2 ceph-mon[74913]: 6.1f scrub ok
Oct 10 09:45:59 compute-2 ceph-mon[74913]: 4.14 scrub starts
Oct 10 09:45:59 compute-2 ceph-mon[74913]: 4.14 scrub ok
Oct 10 09:45:59 compute-2 ceph-mon[74913]: Reconfiguring mon.compute-0 (monmap changed)...
Oct 10 09:45:59 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct 10 09:45:59 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Oct 10 09:45:59 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:59 compute-2 ceph-mon[74913]: Reconfiguring daemon mon.compute-0 on compute-0
Oct 10 09:45:59 compute-2 ceph-mon[74913]: 5.1f scrub starts
Oct 10 09:45:59 compute-2 ceph-mon[74913]: 5.1f scrub ok
Oct 10 09:45:59 compute-2 ceph-mon[74913]: pgmap v98: 162 pgs: 36 peering, 126 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:45:59 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:59 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:45:59 compute-2 ceph-mon[74913]: Reconfiguring mgr.compute-0.xkdepb (monmap changed)...
Oct 10 09:45:59 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.xkdepb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Oct 10 09:45:59 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 09:45:59 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:45:59 compute-2 ceph-mon[74913]: Reconfiguring daemon mgr.compute-0.xkdepb on compute-0
Oct 10 09:45:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3410162506' entity='client.admin' 
Oct 10 09:45:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:45:59 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Oct 10 09:45:59 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Oct 10 09:46:00 compute-2 ceph-mon[74913]: 6.c scrub starts
Oct 10 09:46:00 compute-2 ceph-mon[74913]: 6.c scrub ok
Oct 10 09:46:00 compute-2 ceph-mon[74913]: 3.e scrub starts
Oct 10 09:46:00 compute-2 ceph-mon[74913]: 3.e scrub ok
Oct 10 09:46:00 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:00 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:00 compute-2 ceph-mon[74913]: Reconfiguring crash.compute-0 (monmap changed)...
Oct 10 09:46:00 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Oct 10 09:46:00 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:00 compute-2 ceph-mon[74913]: Reconfiguring daemon crash.compute-0 on compute-0
Oct 10 09:46:00 compute-2 ceph-mon[74913]: 5.10 scrub starts
Oct 10 09:46:00 compute-2 ceph-mon[74913]: 5.10 scrub ok
Oct 10 09:46:00 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Oct 10 09:46:00 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Oct 10 09:46:01 compute-2 ceph-mon[74913]: 4.f scrub starts
Oct 10 09:46:01 compute-2 ceph-mon[74913]: 4.f scrub ok
Oct 10 09:46:01 compute-2 ceph-mon[74913]: 5.8 scrub starts
Oct 10 09:46:01 compute-2 ceph-mon[74913]: 5.8 scrub ok
Oct 10 09:46:01 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:01 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:01 compute-2 ceph-mon[74913]: Reconfiguring osd.0 (monmap changed)...
Oct 10 09:46:01 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Oct 10 09:46:01 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:01 compute-2 ceph-mon[74913]: Reconfiguring daemon osd.0 on compute-0
Oct 10 09:46:01 compute-2 ceph-mon[74913]: 5.11 scrub starts
Oct 10 09:46:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2517476288' entity='client.admin' 
Oct 10 09:46:01 compute-2 ceph-mon[74913]: 5.11 scrub ok
Oct 10 09:46:01 compute-2 ceph-mon[74913]: pgmap v99: 162 pgs: 162 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:01 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:01 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:01 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Oct 10 09:46:01 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:01 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.b scrub starts
Oct 10 09:46:01 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.b scrub ok
Oct 10 09:46:02 compute-2 ceph-mon[74913]: 3.4 scrub starts
Oct 10 09:46:02 compute-2 ceph-mon[74913]: 3.4 scrub ok
Oct 10 09:46:02 compute-2 ceph-mon[74913]: 3.11 scrub starts
Oct 10 09:46:02 compute-2 ceph-mon[74913]: 3.11 scrub ok
Oct 10 09:46:02 compute-2 ceph-mon[74913]: Reconfiguring crash.compute-1 (monmap changed)...
Oct 10 09:46:02 compute-2 ceph-mon[74913]: Reconfiguring daemon crash.compute-1 on compute-1
Oct 10 09:46:02 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:02 compute-2 ceph-mon[74913]: 5.15 scrub starts
Oct 10 09:46:02 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:02 compute-2 ceph-mon[74913]: Reconfiguring osd.1 (monmap changed)...
Oct 10 09:46:02 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Oct 10 09:46:02 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:02 compute-2 ceph-mon[74913]: Reconfiguring daemon osd.1 on compute-1
Oct 10 09:46:02 compute-2 ceph-mon[74913]: 5.15 scrub ok
Oct 10 09:46:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config set, name=mgr/dashboard/compute-1.rfugxc/server_addr}] v 0)
Oct 10 09:46:02 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.d scrub starts
Oct 10 09:46:02 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.d scrub ok
Oct 10 09:46:02 compute-2 sudo[80161]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lexqljkghiaptjiegyfnsiqyuezaiajz ; /usr/bin/python3'
Oct 10 09:46:02 compute-2 sudo[80161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:46:03 compute-2 python3[80163]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:46:03 compute-2 sudo[80161]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:03 compute-2 ceph-mon[74913]: 4.4 scrub starts
Oct 10 09:46:03 compute-2 ceph-mon[74913]: 4.4 scrub ok
Oct 10 09:46:03 compute-2 ceph-mon[74913]: 5.b scrub starts
Oct 10 09:46:03 compute-2 ceph-mon[74913]: 5.b scrub ok
Oct 10 09:46:03 compute-2 ceph-mon[74913]: from='client.? ' entity='client.admin' 
Oct 10 09:46:03 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:03 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:03 compute-2 ceph-mon[74913]: Reconfiguring mon.compute-1 (monmap changed)...
Oct 10 09:46:03 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct 10 09:46:03 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Oct 10 09:46:03 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:03 compute-2 ceph-mon[74913]: Reconfiguring daemon mon.compute-1 on compute-1
Oct 10 09:46:03 compute-2 ceph-mon[74913]: pgmap v100: 162 pgs: 162 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:03 compute-2 ceph-mon[74913]: 3.14 scrub starts
Oct 10 09:46:03 compute-2 ceph-mon[74913]: 3.14 scrub ok
Oct 10 09:46:03 compute-2 sudo[80177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:03 compute-2 sudo[80177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:03 compute-2 sudo[80177]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:03 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Oct 10 09:46:03 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Oct 10 09:46:03 compute-2 sudo[80202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:03 compute-2 sudo[80202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:04 compute-2 podman[80242]: 2025-10-10 09:46:04.151213714 +0000 UTC m=+0.043412166 container create 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Oct 10 09:46:04 compute-2 systemd[1]: Started libpod-conmon-1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d.scope.
Oct 10 09:46:04 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:46:04 compute-2 podman[80242]: 2025-10-10 09:46:04.132215192 +0000 UTC m=+0.024413664 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:46:04 compute-2 podman[80242]: 2025-10-10 09:46:04.230520164 +0000 UTC m=+0.122718646 container init 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:46:04 compute-2 podman[80242]: 2025-10-10 09:46:04.2367348 +0000 UTC m=+0.128933252 container start 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 09:46:04 compute-2 podman[80242]: 2025-10-10 09:46:04.240122783 +0000 UTC m=+0.132321235 container attach 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Oct 10 09:46:04 compute-2 sharp_snyder[80258]: 167 167
Oct 10 09:46:04 compute-2 systemd[1]: libpod-1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d.scope: Deactivated successfully.
Oct 10 09:46:04 compute-2 podman[80242]: 2025-10-10 09:46:04.242044806 +0000 UTC m=+0.134243268 container died 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:46:04 compute-2 systemd[1]: var-lib-containers-storage-overlay-c48fc53ebcfe2d7a68661659fca09026f77cc45ab364dd0da4eeb878447a5794-merged.mount: Deactivated successfully.
Oct 10 09:46:04 compute-2 podman[80242]: 2025-10-10 09:46:04.278104697 +0000 UTC m=+0.170303149 container remove 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:46:04 compute-2 systemd[1]: libpod-conmon-1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d.scope: Deactivated successfully.
Oct 10 09:46:04 compute-2 sudo[80202]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:04 compute-2 sudo[80275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:04 compute-2 sudo[80275]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:04 compute-2 sudo[80275]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:04 compute-2 ceph-mon[74913]: 5.5 scrub starts
Oct 10 09:46:04 compute-2 ceph-mon[74913]: 5.5 scrub ok
Oct 10 09:46:04 compute-2 ceph-mon[74913]: 5.d scrub starts
Oct 10 09:46:04 compute-2 ceph-mon[74913]: 5.d scrub ok
Oct 10 09:46:04 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:04 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:04 compute-2 ceph-mon[74913]: Reconfiguring mon.compute-2 (monmap changed)...
Oct 10 09:46:04 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct 10 09:46:04 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Oct 10 09:46:04 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:04 compute-2 ceph-mon[74913]: Reconfiguring daemon mon.compute-2 on compute-2
Oct 10 09:46:04 compute-2 ceph-mon[74913]: 5.16 scrub starts
Oct 10 09:46:04 compute-2 ceph-mon[74913]: 5.16 scrub ok
Oct 10 09:46:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/699590867' entity='client.admin' 
Oct 10 09:46:04 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:04 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:04 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gkrssp", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Oct 10 09:46:04 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 09:46:04 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:04 compute-2 sudo[80300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:04 compute-2 sudo[80300]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:04 compute-2 podman[80343]: 2025-10-10 09:46:04.724421669 +0000 UTC m=+0.035054177 container create 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Oct 10 09:46:04 compute-2 systemd[1]: Started libpod-conmon-74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32.scope.
Oct 10 09:46:04 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:46:04 compute-2 podman[80343]: 2025-10-10 09:46:04.797022615 +0000 UTC m=+0.107655143 container init 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:46:04 compute-2 podman[80343]: 2025-10-10 09:46:04.80376459 +0000 UTC m=+0.114397098 container start 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 09:46:04 compute-2 podman[80343]: 2025-10-10 09:46:04.70911696 +0000 UTC m=+0.019749488 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:46:04 compute-2 podman[80343]: 2025-10-10 09:46:04.807382581 +0000 UTC m=+0.118015119 container attach 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:46:04 compute-2 nostalgic_heyrovsky[80359]: 167 167
Oct 10 09:46:04 compute-2 systemd[1]: libpod-74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32.scope: Deactivated successfully.
Oct 10 09:46:04 compute-2 podman[80343]: 2025-10-10 09:46:04.812428158 +0000 UTC m=+0.123060676 container died 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct 10 09:46:04 compute-2 systemd[1]: var-lib-containers-storage-overlay-41cc191df743073c417e0eb5be5c6f6640f1aee3d02466fd2ff204cb5e998fe6-merged.mount: Deactivated successfully.
Oct 10 09:46:04 compute-2 podman[80343]: 2025-10-10 09:46:04.844167864 +0000 UTC m=+0.154800372 container remove 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Oct 10 09:46:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:04 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Oct 10 09:46:04 compute-2 systemd[1]: libpod-conmon-74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32.scope: Deactivated successfully.
Oct 10 09:46:04 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Oct 10 09:46:04 compute-2 sudo[80300]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:04 compute-2 sudo[80375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:04 compute-2 sudo[80375]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:04 compute-2 sudo[80375]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:05 compute-2 sudo[80400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Oct 10 09:46:05 compute-2 sudo[80400]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:05 compute-2 ceph-mon[74913]: 6.6 scrub starts
Oct 10 09:46:05 compute-2 ceph-mon[74913]: 6.6 scrub ok
Oct 10 09:46:05 compute-2 ceph-mon[74913]: 4.8 scrub starts
Oct 10 09:46:05 compute-2 ceph-mon[74913]: 4.8 scrub ok
Oct 10 09:46:05 compute-2 ceph-mon[74913]: Reconfiguring mgr.compute-2.gkrssp (monmap changed)...
Oct 10 09:46:05 compute-2 ceph-mon[74913]: Reconfiguring daemon mgr.compute-2.gkrssp on compute-2
Oct 10 09:46:05 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:05 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:05 compute-2 ceph-mon[74913]: pgmap v101: 162 pgs: 162 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:05 compute-2 ceph-mon[74913]: 4.13 scrub starts
Oct 10 09:46:05 compute-2 ceph-mon[74913]: 4.13 scrub ok
Oct 10 09:46:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1171706134' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Oct 10 09:46:05 compute-2 podman[80497]: 2025-10-10 09:46:05.521096742 +0000 UTC m=+0.053004725 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Oct 10 09:46:05 compute-2 podman[80497]: 2025-10-10 09:46:05.609128011 +0000 UTC m=+0.141035964 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:46:05 compute-2 sudo[80400]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:05 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Oct 10 09:46:05 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Oct 10 09:46:05 compute-2 sudo[80584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:05 compute-2 sudo[80584]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:05 compute-2 sudo[80584]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:06 compute-2 sudo[80609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:46:06 compute-2 sudo[80609]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:06 compute-2 sudo[80609]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:06 compute-2 ceph-mon[74913]: 3.2 scrub starts
Oct 10 09:46:06 compute-2 ceph-mon[74913]: 3.2 scrub ok
Oct 10 09:46:06 compute-2 ceph-mon[74913]: 4.9 scrub starts
Oct 10 09:46:06 compute-2 ceph-mon[74913]: 4.9 scrub ok
Oct 10 09:46:06 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:06 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:06 compute-2 ceph-mon[74913]: 3.c scrub starts
Oct 10 09:46:06 compute-2 ceph-mon[74913]: 3.c scrub ok
Oct 10 09:46:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1171706134' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Oct 10 09:46:06 compute-2 ceph-mon[74913]: mgrmap e11: compute-0.xkdepb(active, since 2m), standbys: compute-2.gkrssp, compute-1.rfugxc
Oct 10 09:46:06 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:06 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:06 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Oct 10 09:46:06 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Oct 10 09:46:07 compute-2 ceph-mon[74913]: 3.1 deep-scrub starts
Oct 10 09:46:07 compute-2 ceph-mon[74913]: 3.1 deep-scrub ok
Oct 10 09:46:07 compute-2 ceph-mon[74913]: 3.0 scrub starts
Oct 10 09:46:07 compute-2 ceph-mon[74913]: 3.0 scrub ok
Oct 10 09:46:07 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:07 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:46:07 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:07 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:46:07 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:46:07 compute-2 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:07 compute-2 ceph-mon[74913]: pgmap v102: 162 pgs: 162 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:07 compute-2 ceph-mon[74913]: 3.f deep-scrub starts
Oct 10 09:46:07 compute-2 ceph-mon[74913]: 3.f deep-scrub ok
Oct 10 09:46:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/520827948' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr handle_mgr_map respawning because set of enabled modules changed!
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  e: '/usr/bin/ceph-mgr'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  0: '/usr/bin/ceph-mgr'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  1: '-n'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  2: 'mgr.compute-2.gkrssp'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  3: '-f'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  4: '--setuser'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  5: 'ceph'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  6: '--setgroup'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  7: 'ceph'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  8: '--default-log-to-file=false'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  9: '--default-log-to-journald=true'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  10: '--default-log-to-stderr=false'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr respawn  exe_path /proc/self/exe
Oct 10 09:46:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setuser ceph since I am not root
Oct 10 09:46:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setgroup ceph since I am not root
Oct 10 09:46:07 compute-2 sshd-session[72034]: Connection closed by 192.168.122.100 port 41986
Oct 10 09:46:07 compute-2 sshd-session[71804]: Connection closed by 192.168.122.100 port 41926
Oct 10 09:46:07 compute-2 sshd-session[71949]: Connection closed by 192.168.122.100 port 41964
Oct 10 09:46:07 compute-2 sshd-session[71978]: Connection closed by 192.168.122.100 port 41972
Oct 10 09:46:07 compute-2 sshd-session[71745]: Connection closed by 192.168.122.100 port 41892
Oct 10 09:46:07 compute-2 sshd-session[72005]: Connection closed by 192.168.122.100 port 41984
Oct 10 09:46:07 compute-2 sshd-session[71891]: Connection closed by 192.168.122.100 port 41956
Oct 10 09:46:07 compute-2 sshd-session[71775]: Connection closed by 192.168.122.100 port 41914
Oct 10 09:46:07 compute-2 sshd-session[71862]: Connection closed by 192.168.122.100 port 41940
Oct 10 09:46:07 compute-2 sshd-session[71746]: Connection closed by 192.168.122.100 port 41906
Oct 10 09:46:07 compute-2 sshd-session[71920]: Connection closed by 192.168.122.100 port 41962
Oct 10 09:46:07 compute-2 sshd-session[71833]: Connection closed by 192.168.122.100 port 41936
Oct 10 09:46:07 compute-2 sshd-session[71859]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 sshd-session[71946]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 sshd-session[72002]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 sshd-session[71801]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 sshd-session[71772]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 sshd-session[71830]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 systemd[1]: session-27.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 sshd-session[71720]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 27 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 sshd-session[71917]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 systemd[1]: session-32.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 sshd-session[71739]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 systemd[1]: session-26.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 sshd-session[71888]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 systemd[1]: session-25.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 sshd-session[71975]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 systemd[1]: session-21.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 systemd[1]: session-29.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 systemd[1]: session-23.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 systemd[1]: session-28.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 sshd-session[72031]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:07 compute-2 systemd[1]: session-31.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 32 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 systemd[1]: session-30.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 26 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 21 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 systemd[1]: session-24.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 23 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 systemd[1]: session-33.scope: Deactivated successfully.
Oct 10 09:46:07 compute-2 systemd[1]: session-33.scope: Consumed 1min 1.779s CPU time.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 29 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 28 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 25 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 31 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: pidfile_write: ignore empty --pid-file
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 30 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 24 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Session 33 logged out. Waiting for processes to exit.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 27.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 32.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 26.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 25.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 21.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 29.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 23.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 28.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 31.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 30.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 24.
Oct 10 09:46:07 compute-2 systemd-logind[796]: Removed session 33.
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'alerts'
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'balancer'
Oct 10 09:46:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:07.759+0000 7f3d49c77140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 09:46:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:07.834+0000 7f3d49c77140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 09:46:07 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'cephadm'
Oct 10 09:46:07 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Oct 10 09:46:07 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Oct 10 09:46:08 compute-2 ceph-mon[74913]: 6.4 scrub starts
Oct 10 09:46:08 compute-2 ceph-mon[74913]: 6.4 scrub ok
Oct 10 09:46:08 compute-2 ceph-mon[74913]: 5.0 scrub starts
Oct 10 09:46:08 compute-2 ceph-mon[74913]: 5.0 scrub ok
Oct 10 09:46:08 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/520827948' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Oct 10 09:46:08 compute-2 ceph-mon[74913]: mgrmap e12: compute-0.xkdepb(active, since 2m), standbys: compute-2.gkrssp, compute-1.rfugxc
Oct 10 09:46:08 compute-2 ceph-mon[74913]: 3.d scrub starts
Oct 10 09:46:08 compute-2 ceph-mon[74913]: 3.d scrub ok
Oct 10 09:46:08 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'crash'
Oct 10 09:46:08 compute-2 ceph-mgr[75218]: mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 09:46:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:08.591+0000 7f3d49c77140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 09:46:08 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'dashboard'
Oct 10 09:46:08 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Oct 10 09:46:08 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Oct 10 09:46:09 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'devicehealth'
Oct 10 09:46:09 compute-2 ceph-mgr[75218]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 09:46:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:09.222+0000 7f3d49c77140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 09:46:09 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'diskprediction_local'
Oct 10 09:46:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Oct 10 09:46:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Oct 10 09:46:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]:   from numpy import show_config as show_numpy_config
Oct 10 09:46:09 compute-2 ceph-mgr[75218]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 09:46:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:09.385+0000 7f3d49c77140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 09:46:09 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'influx'
Oct 10 09:46:09 compute-2 ceph-mgr[75218]: mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 09:46:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:09.460+0000 7f3d49c77140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 09:46:09 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'insights'
Oct 10 09:46:09 compute-2 ceph-mon[74913]: 6.0 scrub starts
Oct 10 09:46:09 compute-2 ceph-mon[74913]: 6.0 scrub ok
Oct 10 09:46:09 compute-2 ceph-mon[74913]: 4.2 scrub starts
Oct 10 09:46:09 compute-2 ceph-mon[74913]: 4.2 scrub ok
Oct 10 09:46:09 compute-2 ceph-mon[74913]: 5.3 scrub starts
Oct 10 09:46:09 compute-2 ceph-mon[74913]: 5.3 scrub ok
Oct 10 09:46:09 compute-2 ceph-mon[74913]: 5.9 scrub starts
Oct 10 09:46:09 compute-2 ceph-mon[74913]: 5.9 scrub ok
Oct 10 09:46:09 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'iostat'
Oct 10 09:46:09 compute-2 ceph-mgr[75218]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 09:46:09 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'k8sevents'
Oct 10 09:46:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:09.597+0000 7f3d49c77140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 09:46:09 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Oct 10 09:46:09 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Oct 10 09:46:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:09 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'localpool'
Oct 10 09:46:10 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'mds_autoscaler'
Oct 10 09:46:10 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'mirroring'
Oct 10 09:46:10 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'nfs'
Oct 10 09:46:10 compute-2 ceph-mon[74913]: 4.19 scrub starts
Oct 10 09:46:10 compute-2 ceph-mon[74913]: 4.19 scrub ok
Oct 10 09:46:10 compute-2 ceph-mon[74913]: 3.6 scrub starts
Oct 10 09:46:10 compute-2 ceph-mon[74913]: 3.6 scrub ok
Oct 10 09:46:10 compute-2 ceph-mon[74913]: 3.10 scrub starts
Oct 10 09:46:10 compute-2 ceph-mon[74913]: 3.10 scrub ok
Oct 10 09:46:10 compute-2 ceph-mgr[75218]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 09:46:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:10.570+0000 7f3d49c77140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 09:46:10 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'orchestrator'
Oct 10 09:46:10 compute-2 ceph-mgr[75218]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:10 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'osd_perf_query'
Oct 10 09:46:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:10.787+0000 7f3d49c77140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:10 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Oct 10 09:46:10 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Oct 10 09:46:10 compute-2 ceph-mgr[75218]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 09:46:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:10.865+0000 7f3d49c77140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 09:46:10 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'osd_support'
Oct 10 09:46:10 compute-2 ceph-mgr[75218]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 09:46:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:10.927+0000 7f3d49c77140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 09:46:10 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'pg_autoscaler'
Oct 10 09:46:11 compute-2 ceph-mgr[75218]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 09:46:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:11.001+0000 7f3d49c77140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 09:46:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'progress'
Oct 10 09:46:11 compute-2 ceph-mgr[75218]: mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 09:46:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:11.079+0000 7f3d49c77140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 09:46:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'prometheus'
Oct 10 09:46:11 compute-2 ceph-mgr[75218]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 09:46:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:11.433+0000 7f3d49c77140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 09:46:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rbd_support'
Oct 10 09:46:11 compute-2 ceph-mon[74913]: 5.1a scrub starts
Oct 10 09:46:11 compute-2 ceph-mon[74913]: 5.1a scrub ok
Oct 10 09:46:11 compute-2 ceph-mon[74913]: 3.7 scrub starts
Oct 10 09:46:11 compute-2 ceph-mon[74913]: 3.7 scrub ok
Oct 10 09:46:11 compute-2 ceph-mon[74913]: 3.13 scrub starts
Oct 10 09:46:11 compute-2 ceph-mon[74913]: 3.13 scrub ok
Oct 10 09:46:11 compute-2 ceph-mgr[75218]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 09:46:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:11.532+0000 7f3d49c77140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 09:46:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'restful'
Oct 10 09:46:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rgw'
Oct 10 09:46:11 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Oct 10 09:46:11 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Oct 10 09:46:11 compute-2 ceph-mgr[75218]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 09:46:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:11.965+0000 7f3d49c77140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 09:46:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rook'
Oct 10 09:46:12 compute-2 ceph-mon[74913]: 4.1 scrub starts
Oct 10 09:46:12 compute-2 ceph-mon[74913]: 4.1 scrub ok
Oct 10 09:46:12 compute-2 ceph-mon[74913]: 4.0 scrub starts
Oct 10 09:46:12 compute-2 ceph-mon[74913]: 4.0 scrub ok
Oct 10 09:46:12 compute-2 ceph-mon[74913]: 6.a scrub starts
Oct 10 09:46:12 compute-2 ceph-mon[74913]: 6.a scrub ok
Oct 10 09:46:12 compute-2 ceph-mgr[75218]: mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 09:46:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:12.535+0000 7f3d49c77140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 09:46:12 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'selftest'
Oct 10 09:46:12 compute-2 ceph-mgr[75218]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 09:46:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:12.607+0000 7f3d49c77140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 09:46:12 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'snap_schedule'
Oct 10 09:46:12 compute-2 ceph-mgr[75218]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 09:46:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:12.686+0000 7f3d49c77140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 09:46:12 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'stats'
Oct 10 09:46:12 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'status'
Oct 10 09:46:12 compute-2 ceph-mgr[75218]: mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 09:46:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:12.834+0000 7f3d49c77140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 09:46:12 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'telegraf'
Oct 10 09:46:12 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Oct 10 09:46:12 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Oct 10 09:46:12 compute-2 ceph-mgr[75218]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 09:46:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:12.906+0000 7f3d49c77140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 09:46:12 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'telemetry'
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 09:46:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:13.058+0000 7f3d49c77140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'test_orchestrator'
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:13.275+0000 7f3d49c77140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'volumes'
Oct 10 09:46:13 compute-2 ceph-mon[74913]: 6.1b scrub starts
Oct 10 09:46:13 compute-2 ceph-mon[74913]: 6.1b scrub ok
Oct 10 09:46:13 compute-2 ceph-mon[74913]: 4.7 scrub starts
Oct 10 09:46:13 compute-2 ceph-mon[74913]: 4.7 scrub ok
Oct 10 09:46:13 compute-2 ceph-mon[74913]: 6.8 scrub starts
Oct 10 09:46:13 compute-2 ceph-mon[74913]: 6.8 scrub ok
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 09:46:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:13.551+0000 7f3d49c77140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'zabbix'
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 09:46:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:13.648+0000 7f3d49c77140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: mgr load Constructed class from module: dashboard
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: [dashboard INFO root] Configured CherryPy, starting engine...
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: [dashboard INFO root] Starting engine...
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: ms_deliver_dispatch: unhandled message 0x56304770f860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Oct 10 09:46:13 compute-2 ceph-mgr[75218]: [dashboard INFO root] Engine started...
Oct 10 09:46:13 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e36 e36: 3 total, 3 up, 3 in
Oct 10 09:46:13 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Oct 10 09:46:13 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Oct 10 09:46:14 compute-2 sshd-session[80707]: Accepted publickey for ceph-admin from 192.168.122.100 port 44966 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:46:14 compute-2 systemd-logind[796]: New session 34 of user ceph-admin.
Oct 10 09:46:14 compute-2 systemd[1]: Started Session 34 of User ceph-admin.
Oct 10 09:46:14 compute-2 sshd-session[80707]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:46:14 compute-2 ceph-mon[74913]: 4.6 scrub starts
Oct 10 09:46:14 compute-2 ceph-mon[74913]: 4.6 scrub ok
Oct 10 09:46:14 compute-2 ceph-mon[74913]: Standby manager daemon compute-1.rfugxc restarted
Oct 10 09:46:14 compute-2 ceph-mon[74913]: Standby manager daemon compute-1.rfugxc started
Oct 10 09:46:14 compute-2 ceph-mon[74913]: 5.6 scrub starts
Oct 10 09:46:14 compute-2 ceph-mon[74913]: 5.6 scrub ok
Oct 10 09:46:14 compute-2 ceph-mon[74913]: Standby manager daemon compute-2.gkrssp restarted
Oct 10 09:46:14 compute-2 ceph-mon[74913]: Standby manager daemon compute-2.gkrssp started
Oct 10 09:46:14 compute-2 ceph-mon[74913]: Active manager daemon compute-0.xkdepb restarted
Oct 10 09:46:14 compute-2 ceph-mon[74913]: Activating manager daemon compute-0.xkdepb
Oct 10 09:46:14 compute-2 ceph-mon[74913]: osdmap e36: 3 total, 3 up, 3 in
Oct 10 09:46:14 compute-2 ceph-mon[74913]: mgrmap e13: compute-0.xkdepb(active, starting, since 0.0350206s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr metadata", "who": "compute-0.xkdepb", "id": "compute-0.xkdepb"}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr metadata", "who": "compute-1.rfugxc", "id": "compute-1.rfugxc"}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr metadata", "who": "compute-2.gkrssp", "id": "compute-2.gkrssp"}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mds metadata"}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata"}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: Manager daemon compute-0.xkdepb is now available
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/mirror_snapshot_schedule"}]: dispatch
Oct 10 09:46:14 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/trash_purge_schedule"}]: dispatch
Oct 10 09:46:14 compute-2 sudo[80711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:14 compute-2 sudo[80711]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:14 compute-2 sudo[80711]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:14 compute-2 sudo[80736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Oct 10 09:46:14 compute-2 sudo[80736]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:14 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.e deep-scrub starts
Oct 10 09:46:14 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.e deep-scrub ok
Oct 10 09:46:15 compute-2 podman[80833]: 2025-10-10 09:46:15.063412384 +0000 UTC m=+0.054116172 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Oct 10 09:46:15 compute-2 podman[80833]: 2025-10-10 09:46:15.192064715 +0000 UTC m=+0.182768483 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Oct 10 09:46:15 compute-2 sudo[80736]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:15 compute-2 ceph-mon[74913]: 3.1b scrub starts
Oct 10 09:46:15 compute-2 ceph-mon[74913]: 3.1b scrub ok
Oct 10 09:46:15 compute-2 ceph-mon[74913]: 4.a scrub starts
Oct 10 09:46:15 compute-2 ceph-mon[74913]: 4.a scrub ok
Oct 10 09:46:15 compute-2 ceph-mon[74913]: 5.c scrub starts
Oct 10 09:46:15 compute-2 ceph-mon[74913]: 5.c scrub ok
Oct 10 09:46:15 compute-2 ceph-mon[74913]: mgrmap e14: compute-0.xkdepb(active, since 1.07712s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:15 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:15 compute-2 sudo[80918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:15 compute-2 sudo[80918]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:15 compute-2 sudo[80918]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:15 compute-2 sudo[80943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:46:15 compute-2 sudo[80943]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:15 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Oct 10 09:46:15 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Oct 10 09:46:16 compute-2 sudo[80943]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:16 compute-2 sudo[81000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:16 compute-2 sudo[81000]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:16 compute-2 sudo[81000]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:16 compute-2 sudo[81025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Oct 10 09:46:16 compute-2 sudo[81025]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:16 compute-2 ceph-mon[74913]: 5.e deep-scrub starts
Oct 10 09:46:16 compute-2 ceph-mon[74913]: 5.e deep-scrub ok
Oct 10 09:46:16 compute-2 ceph-mon[74913]: 4.d scrub starts
Oct 10 09:46:16 compute-2 ceph-mon[74913]: 4.d scrub ok
Oct 10 09:46:16 compute-2 ceph-mon[74913]: [10/Oct/2025:09:46:15] ENGINE Bus STARTING
Oct 10 09:46:16 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:16 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:16 compute-2 ceph-mon[74913]: [10/Oct/2025:09:46:15] ENGINE Serving on https://192.168.122.100:7150
Oct 10 09:46:16 compute-2 ceph-mon[74913]: [10/Oct/2025:09:46:15] ENGINE Client ('192.168.122.100', 44336) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Oct 10 09:46:16 compute-2 ceph-mon[74913]: 6.f scrub starts
Oct 10 09:46:16 compute-2 ceph-mon[74913]: 6.f scrub ok
Oct 10 09:46:16 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:16 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:16 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:16 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:16 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:16 compute-2 ceph-mon[74913]: 3.a scrub starts
Oct 10 09:46:16 compute-2 sudo[81025]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:16 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Oct 10 09:46:16 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Oct 10 09:46:17 compute-2 sudo[81069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Oct 10 09:46:17 compute-2 sudo[81069]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:17 compute-2 sudo[81069]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:17 compute-2 sudo[81094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph
Oct 10 09:46:17 compute-2 sudo[81094]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:17 compute-2 sudo[81094]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:17 compute-2 sudo[81119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:46:17 compute-2 sudo[81119]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:17 compute-2 sudo[81119]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:17 compute-2 ceph-mon[74913]: [10/Oct/2025:09:46:15] ENGINE Serving on http://192.168.122.100:8765
Oct 10 09:46:17 compute-2 ceph-mon[74913]: [10/Oct/2025:09:46:15] ENGINE Bus STARTED
Oct 10 09:46:17 compute-2 ceph-mon[74913]: 4.1c scrub starts
Oct 10 09:46:17 compute-2 ceph-mon[74913]: 4.1c scrub ok
Oct 10 09:46:17 compute-2 ceph-mon[74913]: pgmap v4: 162 pgs: 162 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='client.14385 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-password", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 09:46:17 compute-2 ceph-mon[74913]: 3.a scrub ok
Oct 10 09:46:17 compute-2 ceph-mon[74913]: 3.b deep-scrub starts
Oct 10 09:46:17 compute-2 ceph-mon[74913]: 3.b deep-scrub ok
Oct 10 09:46:17 compute-2 ceph-mon[74913]: mgrmap e15: compute-0.xkdepb(active, since 2s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:17 compute-2 sudo[81144]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:17 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:46:17 compute-2 sudo[81144]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:17 compute-2 sudo[81144]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:17 compute-2 sudo[81169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:46:17 compute-2 sudo[81169]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:17 compute-2 sudo[81169]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:17 compute-2 sudo[81217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:46:17 compute-2 sudo[81217]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:17 compute-2 sudo[81217]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:17 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Oct 10 09:46:17 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Oct 10 09:46:17 compute-2 sudo[81242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:46:17 compute-2 sudo[81242]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:17 compute-2 sudo[81242]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:17 compute-2 sudo[81267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Oct 10 09:46:17 compute-2 sudo[81267]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:17 compute-2 sudo[81267]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:17 compute-2 sudo[81292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:46:17 compute-2 sudo[81292]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:17 compute-2 sudo[81292]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:17 compute-2 sudo[81317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:46:18 compute-2 sudo[81317]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81317]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:46:18 compute-2 sudo[81342]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81342]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:18 compute-2 sudo[81367]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81367]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:46:18 compute-2 sudo[81392]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81392]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:46:18 compute-2 sudo[81440]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81440]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81465]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:46:18 compute-2 sudo[81465]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81465]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:46:18 compute-2 sudo[81490]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81490]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Oct 10 09:46:18 compute-2 sudo[81515]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81515]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81540]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph
Oct 10 09:46:18 compute-2 sudo[81540]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81540]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:46:18 compute-2 sudo[81565]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81565]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 ceph-mon[74913]: 6.1e scrub starts
Oct 10 09:46:18 compute-2 ceph-mon[74913]: 6.1e scrub ok
Oct 10 09:46:18 compute-2 ceph-mon[74913]: Adjusting osd_memory_target on compute-2 to 128.0M
Oct 10 09:46:18 compute-2 ceph-mon[74913]: Unable to set osd_memory_target on compute-2 to 134243532: error parsing value: Value '134243532' is below minimum 939524096
Oct 10 09:46:18 compute-2 ceph-mon[74913]: Adjusting osd_memory_target on compute-0 to 128.0M
Oct 10 09:46:18 compute-2 ceph-mon[74913]: Unable to set osd_memory_target on compute-0 to 134240665: error parsing value: Value '134240665' is below minimum 939524096
Oct 10 09:46:18 compute-2 ceph-mon[74913]: from='client.14397 -' entity='client.admin' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://192.168.122.100:9093", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 09:46:18 compute-2 ceph-mon[74913]: 6.15 scrub starts
Oct 10 09:46:18 compute-2 ceph-mon[74913]: 6.15 scrub ok
Oct 10 09:46:18 compute-2 ceph-mon[74913]: Adjusting osd_memory_target on compute-1 to 128.0M
Oct 10 09:46:18 compute-2 ceph-mon[74913]: Unable to set osd_memory_target on compute-1 to 134243532: error parsing value: Value '134243532' is below minimum 939524096
Oct 10 09:46:18 compute-2 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.conf
Oct 10 09:46:18 compute-2 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.conf
Oct 10 09:46:18 compute-2 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.conf
Oct 10 09:46:18 compute-2 ceph-mon[74913]: 4.b scrub starts
Oct 10 09:46:18 compute-2 ceph-mon[74913]: 4.b scrub ok
Oct 10 09:46:18 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:18 compute-2 sudo[81590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:18 compute-2 sudo[81590]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81590]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81615]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:46:18 compute-2 sudo[81615]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81615]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:46:18 compute-2 sudo[81663]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81663]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Oct 10 09:46:18 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Oct 10 09:46:18 compute-2 sudo[81688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:46:18 compute-2 sudo[81688]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81688]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:18 compute-2 sudo[81713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Oct 10 09:46:18 compute-2 sudo[81713]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:18 compute-2 sudo[81713]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:19 compute-2 sudo[81738]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:46:19 compute-2 sudo[81738]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:19 compute-2 sudo[81738]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:19 compute-2 sudo[81763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:46:19 compute-2 sudo[81763]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:19 compute-2 sudo[81763]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:19 compute-2 sudo[81788]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:46:19 compute-2 sudo[81788]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:19 compute-2 sudo[81788]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:19 compute-2 sudo[81813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:19 compute-2 sudo[81813]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:19 compute-2 sudo[81813]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:19 compute-2 sudo[81838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:46:19 compute-2 sudo[81838]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:19 compute-2 sudo[81838]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:19 compute-2 sudo[81886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:46:19 compute-2 sudo[81886]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:19 compute-2 sudo[81886]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:19 compute-2 sudo[81911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:46:19 compute-2 sudo[81911]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:19 compute-2 sudo[81911]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:19 compute-2 sudo[81936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:46:19 compute-2 sudo[81936]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:19 compute-2 sudo[81936]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:19 compute-2 ceph-mon[74913]: 3.1a scrub starts
Oct 10 09:46:19 compute-2 ceph-mon[74913]: 3.1a scrub ok
Oct 10 09:46:19 compute-2 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:46:19 compute-2 ceph-mon[74913]: pgmap v5: 162 pgs: 162 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:19 compute-2 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:46:19 compute-2 ceph-mon[74913]: from='client.14403 -' entity='client.admin' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://192.168.122.100:9092", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 09:46:19 compute-2 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:46:19 compute-2 ceph-mon[74913]: 6.7 scrub starts
Oct 10 09:46:19 compute-2 ceph-mon[74913]: 6.7 scrub ok
Oct 10 09:46:19 compute-2 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Oct 10 09:46:19 compute-2 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Oct 10 09:46:19 compute-2 ceph-mon[74913]: mgrmap e16: compute-0.xkdepb(active, since 4s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:19 compute-2 ceph-mon[74913]: 5.a scrub starts
Oct 10 09:46:19 compute-2 ceph-mon[74913]: 5.a scrub ok
Oct 10 09:46:19 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:19 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:19 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:19 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:19 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:19 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Oct 10 09:46:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:19 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr handle_mgr_map respawning because set of enabled modules changed!
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  e: '/usr/bin/ceph-mgr'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  0: '/usr/bin/ceph-mgr'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  1: '-n'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  2: 'mgr.compute-2.gkrssp'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  3: '-f'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  4: '--setuser'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  5: 'ceph'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  6: '--setgroup'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  7: 'ceph'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  8: '--default-log-to-file=false'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  9: '--default-log-to-journald=true'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  10: '--default-log-to-stderr=false'
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr respawn  exe_path /proc/self/exe
Oct 10 09:46:20 compute-2 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Oct 10 09:46:20 compute-2 ceph-mon[74913]: from='client.24217 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "http://192.168.122.100:3100", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 09:46:20 compute-2 ceph-mon[74913]: 4.15 scrub starts
Oct 10 09:46:20 compute-2 ceph-mon[74913]: 4.15 scrub ok
Oct 10 09:46:20 compute-2 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:46:20 compute-2 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:46:20 compute-2 ceph-mon[74913]: 5.7 scrub starts
Oct 10 09:46:20 compute-2 ceph-mon[74913]: 5.7 scrub ok
Oct 10 09:46:20 compute-2 ceph-mon[74913]: 6.9 deep-scrub starts
Oct 10 09:46:20 compute-2 ceph-mon[74913]: 6.9 deep-scrub ok
Oct 10 09:46:20 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1314314115' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Oct 10 09:46:20 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:20 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:20 compute-2 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:20 compute-2 sshd-session[80710]: Connection closed by 192.168.122.100 port 44966
Oct 10 09:46:20 compute-2 sshd-session[80707]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:46:20 compute-2 systemd[1]: session-34.scope: Deactivated successfully.
Oct 10 09:46:20 compute-2 systemd[1]: session-34.scope: Consumed 4.416s CPU time.
Oct 10 09:46:20 compute-2 systemd-logind[796]: Session 34 logged out. Waiting for processes to exit.
Oct 10 09:46:20 compute-2 systemd-logind[796]: Removed session 34.
Oct 10 09:46:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setuser ceph since I am not root
Oct 10 09:46:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setgroup ceph since I am not root
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: pidfile_write: ignore empty --pid-file
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'alerts'
Oct 10 09:46:20 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Oct 10 09:46:20 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 09:46:20 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'balancer'
Oct 10 09:46:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:20.940+0000 7f9794936140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 09:46:21 compute-2 ceph-mgr[75218]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 09:46:21 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'cephadm'
Oct 10 09:46:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:21.028+0000 7f9794936140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 09:46:21 compute-2 ceph-mon[74913]: 4.3 scrub starts
Oct 10 09:46:21 compute-2 ceph-mon[74913]: 4.3 scrub ok
Oct 10 09:46:21 compute-2 ceph-mon[74913]: 5.2 scrub starts
Oct 10 09:46:21 compute-2 ceph-mon[74913]: 5.2 scrub ok
Oct 10 09:46:21 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1314314115' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Oct 10 09:46:21 compute-2 ceph-mon[74913]: mgrmap e17: compute-0.xkdepb(active, since 6s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:21 compute-2 ceph-mon[74913]: 6.b scrub starts
Oct 10 09:46:21 compute-2 ceph-mon[74913]: 6.b scrub ok
Oct 10 09:46:21 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2158945969' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Oct 10 09:46:21 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'crash'
Oct 10 09:46:21 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Oct 10 09:46:21 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Oct 10 09:46:21 compute-2 ceph-mgr[75218]: mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 09:46:21 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'dashboard'
Oct 10 09:46:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:21.832+0000 7f9794936140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 09:46:22 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'devicehealth'
Oct 10 09:46:22 compute-2 ceph-mgr[75218]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 09:46:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:22.490+0000 7f9794936140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 09:46:22 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'diskprediction_local'
Oct 10 09:46:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Oct 10 09:46:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Oct 10 09:46:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]:   from numpy import show_config as show_numpy_config
Oct 10 09:46:22 compute-2 ceph-mgr[75218]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 09:46:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:22.653+0000 7f9794936140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 09:46:22 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'influx'
Oct 10 09:46:22 compute-2 ceph-mon[74913]: 4.1d scrub starts
Oct 10 09:46:22 compute-2 ceph-mon[74913]: 4.1d scrub ok
Oct 10 09:46:22 compute-2 ceph-mon[74913]: 6.5 scrub starts
Oct 10 09:46:22 compute-2 ceph-mon[74913]: 6.5 scrub ok
Oct 10 09:46:22 compute-2 ceph-mon[74913]: 4.17 scrub starts
Oct 10 09:46:22 compute-2 ceph-mon[74913]: 4.17 scrub ok
Oct 10 09:46:22 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2158945969' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Oct 10 09:46:22 compute-2 ceph-mon[74913]: mgrmap e18: compute-0.xkdepb(active, since 7s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:22 compute-2 ceph-mgr[75218]: mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 09:46:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:22.724+0000 7f9794936140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 09:46:22 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'insights'
Oct 10 09:46:22 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.9 deep-scrub starts
Oct 10 09:46:22 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'iostat'
Oct 10 09:46:22 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.9 deep-scrub ok
Oct 10 09:46:22 compute-2 ceph-mgr[75218]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 09:46:22 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'k8sevents'
Oct 10 09:46:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:22.869+0000 7f9794936140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 09:46:23 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'localpool'
Oct 10 09:46:23 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'mds_autoscaler'
Oct 10 09:46:23 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'mirroring'
Oct 10 09:46:23 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'nfs'
Oct 10 09:46:23 compute-2 ceph-mon[74913]: 3.1d scrub starts
Oct 10 09:46:23 compute-2 ceph-mon[74913]: 3.1d scrub ok
Oct 10 09:46:23 compute-2 ceph-mon[74913]: 3.3 deep-scrub starts
Oct 10 09:46:23 compute-2 ceph-mon[74913]: 3.3 deep-scrub ok
Oct 10 09:46:23 compute-2 ceph-mon[74913]: 4.16 scrub starts
Oct 10 09:46:23 compute-2 ceph-mon[74913]: 4.16 scrub ok
Oct 10 09:46:23 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Oct 10 09:46:23 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Oct 10 09:46:23 compute-2 ceph-mgr[75218]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 09:46:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:23.871+0000 7f9794936140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 09:46:23 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'orchestrator'
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.093+0000 7f9794936140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'osd_perf_query'
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'osd_support'
Oct 10 09:46:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.169+0000 7f9794936140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'pg_autoscaler'
Oct 10 09:46:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.232+0000 7f9794936140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'progress'
Oct 10 09:46:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.313+0000 7f9794936140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'prometheus'
Oct 10 09:46:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.383+0000 7f9794936140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.715+0000 7f9794936140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rbd_support'
Oct 10 09:46:24 compute-2 ceph-mon[74913]: 3.9 deep-scrub starts
Oct 10 09:46:24 compute-2 ceph-mon[74913]: 3.9 deep-scrub ok
Oct 10 09:46:24 compute-2 ceph-mon[74913]: 5.1 scrub starts
Oct 10 09:46:24 compute-2 ceph-mon[74913]: 5.1 scrub ok
Oct 10 09:46:24 compute-2 ceph-mon[74913]: 5.17 scrub starts
Oct 10 09:46:24 compute-2 ceph-mon[74913]: 5.17 scrub ok
Oct 10 09:46:24 compute-2 ceph-mon[74913]: 5.12 scrub starts
Oct 10 09:46:24 compute-2 ceph-mon[74913]: 5.12 scrub ok
Oct 10 09:46:24 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Oct 10 09:46:24 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Oct 10 09:46:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.815+0000 7f9794936140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 09:46:24 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'restful'
Oct 10 09:46:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:25 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rgw'
Oct 10 09:46:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:25.270+0000 7f9794936140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 09:46:25 compute-2 ceph-mgr[75218]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 09:46:25 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rook'
Oct 10 09:46:25 compute-2 ceph-mon[74913]: 3.5 scrub starts
Oct 10 09:46:25 compute-2 ceph-mon[74913]: 3.5 scrub ok
Oct 10 09:46:25 compute-2 ceph-mon[74913]: 6.14 scrub starts
Oct 10 09:46:25 compute-2 ceph-mon[74913]: 6.14 scrub ok
Oct 10 09:46:25 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.12 deep-scrub starts
Oct 10 09:46:25 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.12 deep-scrub ok
Oct 10 09:46:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:25.833+0000 7f9794936140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 09:46:25 compute-2 ceph-mgr[75218]: mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 09:46:25 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'selftest'
Oct 10 09:46:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:25.904+0000 7f9794936140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 09:46:25 compute-2 ceph-mgr[75218]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 09:46:25 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'snap_schedule'
Oct 10 09:46:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:25.988+0000 7f9794936140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 09:46:25 compute-2 ceph-mgr[75218]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 09:46:25 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'stats'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'status'
Oct 10 09:46:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.136+0000 7f9794936140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'telegraf'
Oct 10 09:46:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.208+0000 7f9794936140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'telemetry'
Oct 10 09:46:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.380+0000 7f9794936140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'test_orchestrator'
Oct 10 09:46:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.596+0000 7f9794936140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'volumes'
Oct 10 09:46:26 compute-2 ceph-mon[74913]: 5.4 scrub starts
Oct 10 09:46:26 compute-2 ceph-mon[74913]: 5.4 scrub ok
Oct 10 09:46:26 compute-2 ceph-mon[74913]: 4.e deep-scrub starts
Oct 10 09:46:26 compute-2 ceph-mon[74913]: 4.e deep-scrub ok
Oct 10 09:46:26 compute-2 ceph-mon[74913]: 3.12 scrub starts
Oct 10 09:46:26 compute-2 ceph-mon[74913]: 3.12 scrub ok
Oct 10 09:46:26 compute-2 ceph-mon[74913]: Standby manager daemon compute-1.rfugxc restarted
Oct 10 09:46:26 compute-2 ceph-mon[74913]: Standby manager daemon compute-1.rfugxc started
Oct 10 09:46:26 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Oct 10 09:46:26 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Oct 10 09:46:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.891+0000 7f9794936140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'zabbix'
Oct 10 09:46:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.963+0000 7f9794936140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 09:46:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e37 e37: 3 total, 3 up, 3 in
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: ms_deliver_dispatch: unhandled message 0x56089d6a9860 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr handle_mgr_map respawning because set of enabled modules changed!
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  e: '/usr/bin/ceph-mgr'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  0: '/usr/bin/ceph-mgr'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  1: '-n'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  2: 'mgr.compute-2.gkrssp'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  3: '-f'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  4: '--setuser'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  5: 'ceph'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  6: '--setgroup'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  7: 'ceph'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  8: '--default-log-to-file=false'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  9: '--default-log-to-journald=true'
Oct 10 09:46:26 compute-2 ceph-mgr[75218]: mgr respawn  10: '--default-log-to-stderr=false'
Oct 10 09:46:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setuser ceph since I am not root
Oct 10 09:46:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setgroup ceph since I am not root
Oct 10 09:46:27 compute-2 ceph-mgr[75218]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Oct 10 09:46:27 compute-2 ceph-mgr[75218]: pidfile_write: ignore empty --pid-file
Oct 10 09:46:27 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'alerts'
Oct 10 09:46:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:27.194+0000 7f268f1ee140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 09:46:27 compute-2 ceph-mgr[75218]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 09:46:27 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'balancer'
Oct 10 09:46:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:27.277+0000 7f268f1ee140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 09:46:27 compute-2 ceph-mgr[75218]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 09:46:27 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'cephadm'
Oct 10 09:46:27 compute-2 ceph-mon[74913]: 6.12 deep-scrub starts
Oct 10 09:46:27 compute-2 ceph-mon[74913]: 6.12 deep-scrub ok
Oct 10 09:46:27 compute-2 ceph-mon[74913]: 5.f scrub starts
Oct 10 09:46:27 compute-2 ceph-mon[74913]: 5.f scrub ok
Oct 10 09:46:27 compute-2 ceph-mon[74913]: mgrmap e19: compute-0.xkdepb(active, since 12s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:27 compute-2 ceph-mon[74913]: 5.14 scrub starts
Oct 10 09:46:27 compute-2 ceph-mon[74913]: 5.14 scrub ok
Oct 10 09:46:27 compute-2 ceph-mon[74913]: Active manager daemon compute-0.xkdepb restarted
Oct 10 09:46:27 compute-2 ceph-mon[74913]: Activating manager daemon compute-0.xkdepb
Oct 10 09:46:27 compute-2 ceph-mon[74913]: osdmap e37: 3 total, 3 up, 3 in
Oct 10 09:46:27 compute-2 ceph-mon[74913]: mgrmap e20: compute-0.xkdepb(active, starting, since 0.0290775s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:27 compute-2 ceph-mon[74913]: Standby manager daemon compute-2.gkrssp restarted
Oct 10 09:46:27 compute-2 ceph-mon[74913]: Standby manager daemon compute-2.gkrssp started
Oct 10 09:46:27 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Oct 10 09:46:27 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Oct 10 09:46:27 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'crash'
Oct 10 09:46:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:28.079+0000 7f268f1ee140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 09:46:28 compute-2 ceph-mgr[75218]: mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 09:46:28 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'dashboard'
Oct 10 09:46:28 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'devicehealth'
Oct 10 09:46:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:28.704+0000 7f268f1ee140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 09:46:28 compute-2 ceph-mgr[75218]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 09:46:28 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'diskprediction_local'
Oct 10 09:46:28 compute-2 ceph-mon[74913]: 6.1c scrub starts
Oct 10 09:46:28 compute-2 ceph-mon[74913]: 6.1c scrub ok
Oct 10 09:46:28 compute-2 ceph-mon[74913]: 6.3 scrub starts
Oct 10 09:46:28 compute-2 ceph-mon[74913]: 6.3 scrub ok
Oct 10 09:46:28 compute-2 ceph-mon[74913]: 6.16 scrub starts
Oct 10 09:46:28 compute-2 ceph-mon[74913]: 6.16 scrub ok
Oct 10 09:46:28 compute-2 ceph-mon[74913]: mgrmap e21: compute-0.xkdepb(active, starting, since 1.0465s), standbys: compute-2.gkrssp, compute-1.rfugxc
Oct 10 09:46:28 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Oct 10 09:46:28 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Oct 10 09:46:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Oct 10 09:46:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Oct 10 09:46:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]:   from numpy import show_config as show_numpy_config
Oct 10 09:46:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:28.885+0000 7f268f1ee140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 09:46:28 compute-2 ceph-mgr[75218]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 09:46:28 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'influx'
Oct 10 09:46:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:28.957+0000 7f268f1ee140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 09:46:28 compute-2 ceph-mgr[75218]: mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 09:46:28 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'insights'
Oct 10 09:46:29 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'iostat'
Oct 10 09:46:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:29.091+0000 7f268f1ee140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 09:46:29 compute-2 ceph-mgr[75218]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 09:46:29 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'k8sevents'
Oct 10 09:46:29 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'localpool'
Oct 10 09:46:29 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'mds_autoscaler'
Oct 10 09:46:29 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'mirroring'
Oct 10 09:46:29 compute-2 ceph-mon[74913]: 5.13 scrub starts
Oct 10 09:46:29 compute-2 ceph-mon[74913]: 5.13 scrub ok
Oct 10 09:46:29 compute-2 ceph-mon[74913]: 6.2 scrub starts
Oct 10 09:46:29 compute-2 ceph-mon[74913]: 6.2 scrub ok
Oct 10 09:46:29 compute-2 ceph-mon[74913]: 6.11 scrub starts
Oct 10 09:46:29 compute-2 ceph-mon[74913]: 6.11 scrub ok
Oct 10 09:46:29 compute-2 ceph-mon[74913]: 6.1 scrub starts
Oct 10 09:46:29 compute-2 ceph-mon[74913]: 6.1 scrub ok
Oct 10 09:46:29 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Oct 10 09:46:29 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Oct 10 09:46:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:29 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'nfs'
Oct 10 09:46:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.091+0000 7f268f1ee140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'orchestrator'
Oct 10 09:46:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.312+0000 7f268f1ee140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'osd_perf_query'
Oct 10 09:46:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.386+0000 7f268f1ee140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'osd_support'
Oct 10 09:46:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.455+0000 7f268f1ee140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'pg_autoscaler'
Oct 10 09:46:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.538+0000 7f268f1ee140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'progress'
Oct 10 09:46:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.610+0000 7f268f1ee140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'prometheus'
Oct 10 09:46:30 compute-2 ceph-mon[74913]: 6.d scrub starts
Oct 10 09:46:30 compute-2 ceph-mon[74913]: 6.d scrub ok
Oct 10 09:46:30 compute-2 ceph-mon[74913]: 6.10 scrub starts
Oct 10 09:46:30 compute-2 ceph-mon[74913]: 6.10 scrub ok
Oct 10 09:46:30 compute-2 ceph-mon[74913]: 3.8 scrub starts
Oct 10 09:46:30 compute-2 ceph-mon[74913]: 3.8 scrub ok
Oct 10 09:46:30 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Oct 10 09:46:30 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Oct 10 09:46:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.940+0000 7f268f1ee140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 09:46:30 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rbd_support'
Oct 10 09:46:30 compute-2 systemd[1]: Stopping User Manager for UID 42477...
Oct 10 09:46:30 compute-2 systemd[71724]: Activating special unit Exit the Session...
Oct 10 09:46:30 compute-2 systemd[71724]: Stopped target Main User Target.
Oct 10 09:46:30 compute-2 systemd[71724]: Stopped target Basic System.
Oct 10 09:46:30 compute-2 systemd[71724]: Stopped target Paths.
Oct 10 09:46:30 compute-2 systemd[71724]: Stopped target Sockets.
Oct 10 09:46:30 compute-2 systemd[71724]: Stopped target Timers.
Oct 10 09:46:30 compute-2 systemd[71724]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 10 09:46:30 compute-2 systemd[71724]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 10 09:46:30 compute-2 systemd[71724]: Closed D-Bus User Message Bus Socket.
Oct 10 09:46:30 compute-2 systemd[71724]: Stopped Create User's Volatile Files and Directories.
Oct 10 09:46:30 compute-2 systemd[71724]: Removed slice User Application Slice.
Oct 10 09:46:30 compute-2 systemd[71724]: Reached target Shutdown.
Oct 10 09:46:30 compute-2 systemd[71724]: Finished Exit the Session.
Oct 10 09:46:30 compute-2 systemd[71724]: Reached target Exit the Session.
Oct 10 09:46:30 compute-2 systemd[1]: user@42477.service: Deactivated successfully.
Oct 10 09:46:30 compute-2 systemd[1]: Stopped User Manager for UID 42477.
Oct 10 09:46:30 compute-2 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Oct 10 09:46:31 compute-2 systemd[1]: run-user-42477.mount: Deactivated successfully.
Oct 10 09:46:31 compute-2 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Oct 10 09:46:31 compute-2 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Oct 10 09:46:31 compute-2 systemd[1]: Removed slice User Slice of UID 42477.
Oct 10 09:46:31 compute-2 systemd[1]: user-42477.slice: Consumed 1min 7.750s CPU time.
Oct 10 09:46:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:31.050+0000 7f268f1ee140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 09:46:31 compute-2 ceph-mgr[75218]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 09:46:31 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'restful'
Oct 10 09:46:31 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rgw'
Oct 10 09:46:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:31.492+0000 7f268f1ee140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 09:46:31 compute-2 ceph-mgr[75218]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 09:46:31 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rook'
Oct 10 09:46:31 compute-2 ceph-mon[74913]: 6.e scrub starts
Oct 10 09:46:31 compute-2 ceph-mon[74913]: 6.e scrub ok
Oct 10 09:46:31 compute-2 ceph-mon[74913]: 6.13 scrub starts
Oct 10 09:46:31 compute-2 ceph-mon[74913]: 6.13 scrub ok
Oct 10 09:46:31 compute-2 ceph-mon[74913]: 2.15 scrub starts
Oct 10 09:46:31 compute-2 ceph-mon[74913]: 2.15 scrub ok
Oct 10 09:46:31 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.12 deep-scrub starts
Oct 10 09:46:31 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.12 deep-scrub ok
Oct 10 09:46:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.097+0000 7f268f1ee140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'selftest'
Oct 10 09:46:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.187+0000 7f268f1ee140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'snap_schedule'
Oct 10 09:46:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.273+0000 7f268f1ee140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'stats'
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'status'
Oct 10 09:46:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.418+0000 7f268f1ee140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'telegraf'
Oct 10 09:46:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.484+0000 7f268f1ee140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'telemetry'
Oct 10 09:46:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.638+0000 7f268f1ee140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'test_orchestrator'
Oct 10 09:46:32 compute-2 ceph-mon[74913]: 5.1c scrub starts
Oct 10 09:46:32 compute-2 ceph-mon[74913]: 5.1c scrub ok
Oct 10 09:46:32 compute-2 ceph-mon[74913]: 5.1e scrub starts
Oct 10 09:46:32 compute-2 ceph-mon[74913]: 5.1e scrub ok
Oct 10 09:46:32 compute-2 ceph-mon[74913]: 2.12 deep-scrub starts
Oct 10 09:46:32 compute-2 ceph-mon[74913]: 2.12 deep-scrub ok
Oct 10 09:46:32 compute-2 ceph-mon[74913]: Standby manager daemon compute-1.rfugxc restarted
Oct 10 09:46:32 compute-2 ceph-mon[74913]: Standby manager daemon compute-1.rfugxc started
Oct 10 09:46:32 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Oct 10 09:46:32 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Oct 10 09:46:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.860+0000 7f268f1ee140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 09:46:32 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'volumes'
Oct 10 09:46:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:33.111+0000 7f268f1ee140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 09:46:33 compute-2 ceph-mgr[75218]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 09:46:33 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'zabbix'
Oct 10 09:46:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:33.183+0000 7f268f1ee140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 09:46:33 compute-2 ceph-mgr[75218]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 09:46:33 compute-2 ceph-mgr[75218]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 10 09:46:33 compute-2 ceph-mgr[75218]: mgr load Constructed class from module: dashboard
Oct 10 09:46:33 compute-2 ceph-mgr[75218]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Oct 10 09:46:33 compute-2 ceph-mgr[75218]: [dashboard INFO root] Configured CherryPy, starting engine...
Oct 10 09:46:33 compute-2 ceph-mgr[75218]: [dashboard INFO root] Starting engine...
Oct 10 09:46:33 compute-2 ceph-mgr[75218]: ms_deliver_dispatch: unhandled message 0x55e60c85f860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Oct 10 09:46:33 compute-2 ceph-mgr[75218]: [dashboard INFO root] Engine started...
Oct 10 09:46:33 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e38 e38: 3 total, 3 up, 3 in
Oct 10 09:46:33 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Oct 10 09:46:33 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Oct 10 09:46:33 compute-2 ceph-mon[74913]: 4.c scrub starts
Oct 10 09:46:33 compute-2 ceph-mon[74913]: 4.c scrub ok
Oct 10 09:46:33 compute-2 ceph-mon[74913]: 6.1d scrub starts
Oct 10 09:46:33 compute-2 ceph-mon[74913]: 6.1d scrub ok
Oct 10 09:46:33 compute-2 ceph-mon[74913]: 2.13 scrub starts
Oct 10 09:46:33 compute-2 ceph-mon[74913]: 2.13 scrub ok
Oct 10 09:46:33 compute-2 ceph-mon[74913]: mgrmap e22: compute-0.xkdepb(active, starting, since 5s), standbys: compute-2.gkrssp, compute-1.rfugxc
Oct 10 09:46:33 compute-2 ceph-mon[74913]: Standby manager daemon compute-2.gkrssp restarted
Oct 10 09:46:33 compute-2 ceph-mon[74913]: Standby manager daemon compute-2.gkrssp started
Oct 10 09:46:33 compute-2 ceph-mon[74913]: Active manager daemon compute-0.xkdepb restarted
Oct 10 09:46:33 compute-2 ceph-mon[74913]: Activating manager daemon compute-0.xkdepb
Oct 10 09:46:33 compute-2 ceph-mon[74913]: osdmap e38: 3 total, 3 up, 3 in
Oct 10 09:46:33 compute-2 ceph-mon[74913]: mgrmap e23: compute-0.xkdepb(active, starting, since 0.0311202s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr metadata", "who": "compute-0.xkdepb", "id": "compute-0.xkdepb"}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr metadata", "who": "compute-1.rfugxc", "id": "compute-1.rfugxc"}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr metadata", "who": "compute-2.gkrssp", "id": "compute-2.gkrssp"}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mds metadata"}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata"}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: Manager daemon compute-0.xkdepb is now available
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/mirror_snapshot_schedule"}]: dispatch
Oct 10 09:46:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/trash_purge_schedule"}]: dispatch
Oct 10 09:46:33 compute-2 sshd-session[82037]: Accepted publickey for ceph-admin from 192.168.122.100 port 55678 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:46:33 compute-2 systemd[1]: Created slice User Slice of UID 42477.
Oct 10 09:46:33 compute-2 systemd[1]: Starting User Runtime Directory /run/user/42477...
Oct 10 09:46:33 compute-2 systemd-logind[796]: New session 35 of user ceph-admin.
Oct 10 09:46:33 compute-2 systemd[1]: Finished User Runtime Directory /run/user/42477.
Oct 10 09:46:33 compute-2 systemd[1]: Starting User Manager for UID 42477...
Oct 10 09:46:33 compute-2 systemd[82041]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:46:34 compute-2 systemd[82041]: Queued start job for default target Main User Target.
Oct 10 09:46:34 compute-2 systemd[82041]: Created slice User Application Slice.
Oct 10 09:46:34 compute-2 systemd[82041]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 10 09:46:34 compute-2 systemd[82041]: Started Daily Cleanup of User's Temporary Directories.
Oct 10 09:46:34 compute-2 systemd[82041]: Reached target Paths.
Oct 10 09:46:34 compute-2 systemd[82041]: Reached target Timers.
Oct 10 09:46:34 compute-2 systemd[82041]: Starting D-Bus User Message Bus Socket...
Oct 10 09:46:34 compute-2 systemd[82041]: Starting Create User's Volatile Files and Directories...
Oct 10 09:46:34 compute-2 systemd[82041]: Listening on D-Bus User Message Bus Socket.
Oct 10 09:46:34 compute-2 systemd[82041]: Reached target Sockets.
Oct 10 09:46:34 compute-2 systemd[82041]: Finished Create User's Volatile Files and Directories.
Oct 10 09:46:34 compute-2 systemd[82041]: Reached target Basic System.
Oct 10 09:46:34 compute-2 systemd[82041]: Reached target Main User Target.
Oct 10 09:46:34 compute-2 systemd[82041]: Startup finished in 107ms.
Oct 10 09:46:34 compute-2 systemd[1]: Started User Manager for UID 42477.
Oct 10 09:46:34 compute-2 systemd[1]: Started Session 35 of User ceph-admin.
Oct 10 09:46:34 compute-2 sshd-session[82037]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:46:34 compute-2 sudo[82056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:34 compute-2 sudo[82056]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:34 compute-2 sudo[82056]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:34 compute-2 sudo[82081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Oct 10 09:46:34 compute-2 sudo[82081]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e2 new map
Oct 10 09:46:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e2 print_map
                                           e2
                                           btime 2025-10-10T09:46:34:511425+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-10-10T09:46:34.511367+0000
                                           modified        2025-10-10T09:46:34.511367+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
Oct 10 09:46:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e39 e39: 3 total, 3 up, 3 in
Oct 10 09:46:34 compute-2 podman[82175]: 2025-10-10 09:46:34.824726368 +0000 UTC m=+0.065211100 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:46:34 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Oct 10 09:46:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:34 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Oct 10 09:46:34 compute-2 ceph-mon[74913]: 3.1c scrub starts
Oct 10 09:46:34 compute-2 ceph-mon[74913]: 3.1c scrub ok
Oct 10 09:46:34 compute-2 ceph-mon[74913]: 3.19 deep-scrub starts
Oct 10 09:46:34 compute-2 ceph-mon[74913]: 3.19 deep-scrub ok
Oct 10 09:46:34 compute-2 ceph-mon[74913]: 2.18 scrub starts
Oct 10 09:46:34 compute-2 ceph-mon[74913]: 2.18 scrub ok
Oct 10 09:46:34 compute-2 ceph-mon[74913]: mgrmap e24: compute-0.xkdepb(active, since 1.0568s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:34 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Oct 10 09:46:34 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Oct 10 09:46:34 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Oct 10 09:46:34 compute-2 ceph-mon[74913]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Oct 10 09:46:34 compute-2 ceph-mon[74913]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Oct 10 09:46:34 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Oct 10 09:46:34 compute-2 ceph-mon[74913]: osdmap e39: 3 total, 3 up, 3 in
Oct 10 09:46:34 compute-2 ceph-mon[74913]: fsmap cephfs:0
Oct 10 09:46:34 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:34 compute-2 podman[82175]: 2025-10-10 09:46:34.957235098 +0000 UTC m=+0.197719820 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:46:35 compute-2 sudo[82081]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:35 compute-2 sudo[82262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:35 compute-2 sudo[82262]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:35 compute-2 sudo[82262]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:35 compute-2 sudo[82287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:46:35 compute-2 sudo[82287]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:35 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.f scrub starts
Oct 10 09:46:35 compute-2 ceph-mon[74913]: 4.1b scrub starts
Oct 10 09:46:35 compute-2 ceph-mon[74913]: 4.1b scrub ok
Oct 10 09:46:35 compute-2 ceph-mon[74913]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Oct 10 09:46:35 compute-2 ceph-mon[74913]: 2.19 scrub starts
Oct 10 09:46:35 compute-2 ceph-mon[74913]: 2.19 scrub ok
Oct 10 09:46:35 compute-2 ceph-mon[74913]: [10/Oct/2025:09:46:34] ENGINE Bus STARTING
Oct 10 09:46:35 compute-2 ceph-mon[74913]: 2.10 scrub starts
Oct 10 09:46:35 compute-2 ceph-mon[74913]: 2.10 scrub ok
Oct 10 09:46:35 compute-2 ceph-mon[74913]: [10/Oct/2025:09:46:34] ENGINE Serving on http://192.168.122.100:8765
Oct 10 09:46:35 compute-2 ceph-mon[74913]: [10/Oct/2025:09:46:35] ENGINE Serving on https://192.168.122.100:7150
Oct 10 09:46:35 compute-2 ceph-mon[74913]: [10/Oct/2025:09:46:35] ENGINE Bus STARTED
Oct 10 09:46:35 compute-2 ceph-mon[74913]: [10/Oct/2025:09:46:35] ENGINE Client ('192.168.122.100', 60804) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Oct 10 09:46:35 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:35 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:35 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:35 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:35 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:35 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:35 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:35 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.f scrub ok
Oct 10 09:46:35 compute-2 sudo[82287]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:36 compute-2 sudo[82343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:36 compute-2 sudo[82343]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:36 compute-2 sudo[82343]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:36 compute-2 sudo[82368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Oct 10 09:46:36 compute-2 sudo[82368]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:36 compute-2 sudo[82368]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:36 compute-2 ceph-mon[74913]: 5.1b scrub starts
Oct 10 09:46:36 compute-2 ceph-mon[74913]: 5.1b scrub ok
Oct 10 09:46:36 compute-2 ceph-mon[74913]: pgmap v5: 162 pgs: 162 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:36 compute-2 ceph-mon[74913]: from='client.14469 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 09:46:36 compute-2 ceph-mon[74913]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Oct 10 09:46:36 compute-2 ceph-mon[74913]: 2.6 scrub starts
Oct 10 09:46:36 compute-2 ceph-mon[74913]: 2.6 scrub ok
Oct 10 09:46:36 compute-2 ceph-mon[74913]: 2.f scrub starts
Oct 10 09:46:36 compute-2 ceph-mon[74913]: 2.f scrub ok
Oct 10 09:46:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:36 compute-2 ceph-mon[74913]: mgrmap e25: compute-0.xkdepb(active, since 2s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Oct 10 09:46:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Oct 10 09:46:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Oct 10 09:46:36 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.c scrub starts
Oct 10 09:46:36 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.c scrub ok
Oct 10 09:46:37 compute-2 sudo[82410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Oct 10 09:46:37 compute-2 sudo[82410]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:37 compute-2 sudo[82410]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:37 compute-2 sudo[82435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph
Oct 10 09:46:37 compute-2 sudo[82435]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:37 compute-2 sudo[82435]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:37 compute-2 sudo[82460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:46:37 compute-2 sudo[82460]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:37 compute-2 sudo[82460]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:37 compute-2 sudo[82485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:37 compute-2 sudo[82485]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:37 compute-2 sudo[82485]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:37 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Oct 10 09:46:37 compute-2 sudo[82510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:46:37 compute-2 sudo[82510]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:37 compute-2 sudo[82510]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:37 compute-2 sudo[82558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:46:37 compute-2 sudo[82558]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:37 compute-2 sudo[82558]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:37 compute-2 sudo[82583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:46:37 compute-2 sudo[82583]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:37 compute-2 sudo[82583]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:37 compute-2 sudo[82608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Oct 10 09:46:37 compute-2 sudo[82608]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:37 compute-2 sudo[82608]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:37 compute-2 sudo[82633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:46:37 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.d scrub starts
Oct 10 09:46:37 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.d scrub ok
Oct 10 09:46:37 compute-2 sudo[82633]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:37 compute-2 sudo[82633]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:37 compute-2 ceph-mon[74913]: 4.18 scrub starts
Oct 10 09:46:37 compute-2 ceph-mon[74913]: 4.18 scrub ok
Oct 10 09:46:37 compute-2 ceph-mon[74913]: Adjusting osd_memory_target on compute-2 to 128.0M
Oct 10 09:46:37 compute-2 ceph-mon[74913]: Unable to set osd_memory_target on compute-2 to 134243532: error parsing value: Value '134243532' is below minimum 939524096
Oct 10 09:46:37 compute-2 ceph-mon[74913]: from='client.14481 -' entity='client.admin' cmd=[{"prefix": "nfs cluster create", "cluster_id": "cephfs", "ingress": true, "virtual_ip": "192.168.122.2/24", "ingress_mode": "haproxy-protocol", "placement": "compute-0 compute-1 compute-2 ", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 09:46:37 compute-2 ceph-mon[74913]: 2.e scrub starts
Oct 10 09:46:37 compute-2 ceph-mon[74913]: 2.e scrub ok
Oct 10 09:46:37 compute-2 ceph-mon[74913]: Adjusting osd_memory_target on compute-0 to 128.0M
Oct 10 09:46:37 compute-2 ceph-mon[74913]: Unable to set osd_memory_target on compute-0 to 134240665: error parsing value: Value '134240665' is below minimum 939524096
Oct 10 09:46:37 compute-2 ceph-mon[74913]: 2.c scrub starts
Oct 10 09:46:37 compute-2 ceph-mon[74913]: 2.c scrub ok
Oct 10 09:46:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Oct 10 09:46:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:46:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Oct 10 09:46:37 compute-2 ceph-mon[74913]: osdmap e40: 3 total, 3 up, 3 in
Oct 10 09:46:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Oct 10 09:46:37 compute-2 sudo[82658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:46:37 compute-2 sudo[82658]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:37 compute-2 sudo[82658]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[82683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:46:38 compute-2 sudo[82683]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[82683]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[82708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:38 compute-2 sudo[82708]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[82708]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[82733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:46:38 compute-2 sudo[82733]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[82733]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[82781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:46:38 compute-2 sudo[82781]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[82781]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[82806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:46:38 compute-2 sudo[82806]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[82806]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[82831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:46:38 compute-2 sudo[82831]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[82831]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[82856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Oct 10 09:46:38 compute-2 sudo[82856]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[82856]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[82881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph
Oct 10 09:46:38 compute-2 sudo[82881]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[82881]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Oct 10 09:46:38 compute-2 sudo[82906]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:46:38 compute-2 sudo[82906]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[82906]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[82931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:38 compute-2 sudo[82931]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[82931]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[82956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:46:38 compute-2 sudo[82956]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[82956]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[83004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:46:38 compute-2 sudo[83004]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[83004]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Oct 10 09:46:38 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Oct 10 09:46:38 compute-2 sudo[83029]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:46:38 compute-2 sudo[83029]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[83029]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 ceph-mon[74913]: Adjusting osd_memory_target on compute-1 to 128.0M
Oct 10 09:46:38 compute-2 ceph-mon[74913]: Unable to set osd_memory_target on compute-1 to 134243532: error parsing value: Value '134243532' is below minimum 939524096
Oct 10 09:46:38 compute-2 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.conf
Oct 10 09:46:38 compute-2 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.conf
Oct 10 09:46:38 compute-2 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.conf
Oct 10 09:46:38 compute-2 ceph-mon[74913]: 4.1a scrub starts
Oct 10 09:46:38 compute-2 ceph-mon[74913]: 4.1a scrub ok
Oct 10 09:46:38 compute-2 ceph-mon[74913]: pgmap v6: 162 pgs: 162 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:38 compute-2 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:46:38 compute-2 ceph-mon[74913]: 2.d scrub starts
Oct 10 09:46:38 compute-2 ceph-mon[74913]: 2.d scrub ok
Oct 10 09:46:38 compute-2 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:46:38 compute-2 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:46:38 compute-2 ceph-mon[74913]: mgrmap e26: compute-0.xkdepb(active, since 4s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:38 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Oct 10 09:46:38 compute-2 ceph-mon[74913]: osdmap e41: 3 total, 3 up, 3 in
Oct 10 09:46:38 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:38 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:38 compute-2 sudo[83054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Oct 10 09:46:38 compute-2 sudo[83054]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[83054]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:38 compute-2 sudo[83079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:46:38 compute-2 sudo[83079]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:38 compute-2 sudo[83079]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:39 compute-2 sudo[83104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:46:39 compute-2 sudo[83104]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:39 compute-2 sudo[83104]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:39 compute-2 sudo[83129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:46:39 compute-2 sudo[83129]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:39 compute-2 sudo[83129]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:39 compute-2 sudo[83154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:39 compute-2 sudo[83154]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:39 compute-2 sudo[83154]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:39 compute-2 sudo[83179]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:46:39 compute-2 sudo[83179]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:39 compute-2 sudo[83179]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:39 compute-2 sudo[83227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:46:39 compute-2 sudo[83227]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:39 compute-2 sudo[83227]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:39 compute-2 sudo[83252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:46:39 compute-2 sudo[83252]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:39 compute-2 sudo[83252]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:39 compute-2 sudo[83277]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:46:39 compute-2 sudo[83277]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:39 compute-2 sudo[83277]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Oct 10 09:46:39 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.b scrub starts
Oct 10 09:46:39 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.b scrub ok
Oct 10 09:46:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:39 compute-2 ceph-mon[74913]: 4.5 scrub starts
Oct 10 09:46:39 compute-2 ceph-mon[74913]: 4.5 scrub ok
Oct 10 09:46:39 compute-2 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Oct 10 09:46:39 compute-2 ceph-mon[74913]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Oct 10 09:46:39 compute-2 ceph-mon[74913]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Oct 10 09:46:39 compute-2 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Oct 10 09:46:39 compute-2 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Oct 10 09:46:39 compute-2 ceph-mon[74913]: 2.5 scrub starts
Oct 10 09:46:39 compute-2 ceph-mon[74913]: 2.5 scrub ok
Oct 10 09:46:39 compute-2 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:46:39 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:39 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:39 compute-2 ceph-mon[74913]: osdmap e42: 3 total, 3 up, 3 in
Oct 10 09:46:39 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:40 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Oct 10 09:46:40 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Oct 10 09:46:40 compute-2 ceph-mon[74913]: 5.18 deep-scrub starts
Oct 10 09:46:40 compute-2 ceph-mon[74913]: 5.18 deep-scrub ok
Oct 10 09:46:40 compute-2 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:46:40 compute-2 ceph-mon[74913]: pgmap v9: 163 pgs: 1 unknown, 162 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:40 compute-2 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:46:40 compute-2 ceph-mon[74913]: 2.b scrub starts
Oct 10 09:46:40 compute-2 ceph-mon[74913]: 2.b scrub ok
Oct 10 09:46:40 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:40 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:40 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:40 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:40 compute-2 ceph-mon[74913]: mgrmap e27: compute-0.xkdepb(active, since 7s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:46:40 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/200213662' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Oct 10 09:46:40 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/200213662' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Oct 10 09:46:41 compute-2 ceph-mon[74913]: 6.19 scrub starts
Oct 10 09:46:41 compute-2 ceph-mon[74913]: 6.19 scrub ok
Oct 10 09:46:41 compute-2 ceph-mon[74913]: Deploying daemon node-exporter.compute-0 on compute-0
Oct 10 09:46:41 compute-2 ceph-mon[74913]: 2.1b scrub starts
Oct 10 09:46:41 compute-2 ceph-mon[74913]: 2.1b scrub ok
Oct 10 09:46:42 compute-2 ceph-mon[74913]: 6.1a scrub starts
Oct 10 09:46:42 compute-2 ceph-mon[74913]: 6.1a scrub ok
Oct 10 09:46:42 compute-2 ceph-mon[74913]: pgmap v11: 163 pgs: 163 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Oct 10 09:46:42 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1404388837' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Oct 10 09:46:42 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:42 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:42 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:43 compute-2 ceph-mon[74913]: Deploying daemon node-exporter.compute-1 on compute-1
Oct 10 09:46:43 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4210446203' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 09:46:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:45 compute-2 ceph-mon[74913]: pgmap v12: 163 pgs: 163 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Oct 10 09:46:45 compute-2 sudo[83302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:45 compute-2 sudo[83302]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:45 compute-2 sudo[83302]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:45 compute-2 sudo[83327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/prometheus/node-exporter:v1.7.0 --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:45 compute-2 sudo[83327]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1088819812' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Oct 10 09:46:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:46 compute-2 systemd[1]: Reloading.
Oct 10 09:46:46 compute-2 systemd-rc-local-generator[83416]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:46:46 compute-2 systemd-sysv-generator[83420]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:46:46 compute-2 systemd[1]: Reloading.
Oct 10 09:46:46 compute-2 systemd-rc-local-generator[83457]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:46:46 compute-2 systemd-sysv-generator[83461]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:46:46 compute-2 systemd[1]: Starting Ceph node-exporter.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:46:47 compute-2 ceph-mon[74913]: pgmap v13: 163 pgs: 163 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 0 B/s wr, 9 op/s
Oct 10 09:46:47 compute-2 ceph-mon[74913]: Deploying daemon node-exporter.compute-2 on compute-2
Oct 10 09:46:47 compute-2 bash[83514]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Oct 10 09:46:47 compute-2 bash[83514]: Getting image source signatures
Oct 10 09:46:47 compute-2 bash[83514]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Oct 10 09:46:47 compute-2 bash[83514]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Oct 10 09:46:47 compute-2 bash[83514]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Oct 10 09:46:48 compute-2 bash[83514]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Oct 10 09:46:48 compute-2 bash[83514]: Writing manifest to image destination
Oct 10 09:46:48 compute-2 podman[83514]: 2025-10-10 09:46:48.194247924 +0000 UTC m=+1.134300309 container create 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 09:46:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8feaa84fe6ee4c345b37cd80291573594d2180df8e2bf40f472c980aaaef067b/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Oct 10 09:46:48 compute-2 podman[83514]: 2025-10-10 09:46:48.244436694 +0000 UTC m=+1.184489099 container init 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 09:46:48 compute-2 podman[83514]: 2025-10-10 09:46:48.24913861 +0000 UTC m=+1.189190995 container start 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 09:46:48 compute-2 bash[83514]: 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f
Oct 10 09:46:48 compute-2 podman[83514]: 2025-10-10 09:46:48.181345734 +0000 UTC m=+1.121398139 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.255Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.255Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.256Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.256Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.256Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.256Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=arp
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=bcache
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=bonding
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=btrfs
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=conntrack
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=cpu
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=diskstats
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=dmi
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=edac
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=entropy
Oct 10 09:46:48 compute-2 systemd[1]: Started Ceph node-exporter.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=filefd
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=filesystem
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=hwmon
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=infiniband
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=ipvs
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=loadavg
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=mdadm
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=meminfo
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=netclass
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=netdev
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=netstat
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=nfs
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=nfsd
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=nvme
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=os
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=pressure
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=rapl
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=schedstat
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=selinux
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=sockstat
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=softnet
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=stat
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=tapestats
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=textfile
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=thermal_zone
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=time
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=uname
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=vmstat
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=xfs
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=zfs
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.259Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Oct 10 09:46:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.260Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Oct 10 09:46:48 compute-2 sudo[83327]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:49 compute-2 ceph-mon[74913]: from='client.14517 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Oct 10 09:46:49 compute-2 ceph-mon[74913]: pgmap v14: 163 pgs: 163 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 0 B/s wr, 8 op/s
Oct 10 09:46:49 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:49 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:49 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:49 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:49 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:46:49 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:46:49 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:49 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:50 compute-2 ceph-mon[74913]: pgmap v15: 163 pgs: 163 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 0 B/s wr, 7 op/s
Oct 10 09:46:50 compute-2 ceph-mon[74913]: from='client.14523 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Oct 10 09:46:52 compute-2 ceph-mon[74913]: from='client.14529 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Oct 10 09:46:52 compute-2 ceph-mon[74913]: pgmap v16: 163 pgs: 163 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:53 compute-2 sudo[83599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:53 compute-2 sudo[83599]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:53 compute-2 sudo[83599]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:53 compute-2 sudo[83624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:53 compute-2 sudo[83624]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:53 compute-2 podman[83690]: 2025-10-10 09:46:53.584299112 +0000 UTC m=+0.033283365 container create a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Oct 10 09:46:53 compute-2 systemd[1]: Started libpod-conmon-a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6.scope.
Oct 10 09:46:53 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:46:53 compute-2 podman[83690]: 2025-10-10 09:46:53.64726273 +0000 UTC m=+0.096247003 container init a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Oct 10 09:46:53 compute-2 podman[83690]: 2025-10-10 09:46:53.65388395 +0000 UTC m=+0.102868213 container start a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 09:46:53 compute-2 podman[83690]: 2025-10-10 09:46:53.657026714 +0000 UTC m=+0.106010957 container attach a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:46:53 compute-2 musing_tharp[83706]: 167 167
Oct 10 09:46:53 compute-2 podman[83690]: 2025-10-10 09:46:53.659131004 +0000 UTC m=+0.108115257 container died a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Oct 10 09:46:53 compute-2 systemd[1]: libpod-a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6.scope: Deactivated successfully.
Oct 10 09:46:53 compute-2 podman[83690]: 2025-10-10 09:46:53.568785117 +0000 UTC m=+0.017769390 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:46:53 compute-2 systemd[1]: var-lib-containers-storage-overlay-915c4c4a853f005b9b4db784cfbf18718f803b70c72649a21bab246578cd4cef-merged.mount: Deactivated successfully.
Oct 10 09:46:53 compute-2 podman[83690]: 2025-10-10 09:46:53.693190483 +0000 UTC m=+0.142174736 container remove a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:46:53 compute-2 systemd[1]: libpod-conmon-a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6.scope: Deactivated successfully.
Oct 10 09:46:53 compute-2 systemd[1]: Reloading.
Oct 10 09:46:53 compute-2 systemd-rc-local-generator[83751]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:46:53 compute-2 systemd-sysv-generator[83754]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:46:54 compute-2 systemd[1]: Reloading.
Oct 10 09:46:54 compute-2 ceph-mon[74913]: from='client.14535 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Oct 10 09:46:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.qujzwn", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 09:46:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.qujzwn", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 09:46:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:54 compute-2 systemd-sysv-generator[83794]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:46:54 compute-2 systemd-rc-local-generator[83789]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:46:54 compute-2 systemd[1]: Starting Ceph rgw.rgw.compute-2.qujzwn for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:46:54 compute-2 podman[83847]: 2025-10-10 09:46:54.501133177 +0000 UTC m=+0.036061227 container create 5800067cdbcc263d30e91141fbfd65d1b3e7f5b67048140f597794aacb645a20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-rgw-rgw-compute-2-qujzwn, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:46:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54d434abdc246f356a32cc9bc843623f9c05eccf5c99a502203f8a69fbae1c8f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:46:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54d434abdc246f356a32cc9bc843623f9c05eccf5c99a502203f8a69fbae1c8f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:46:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54d434abdc246f356a32cc9bc843623f9c05eccf5c99a502203f8a69fbae1c8f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 09:46:54 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54d434abdc246f356a32cc9bc843623f9c05eccf5c99a502203f8a69fbae1c8f/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.qujzwn supports timestamps until 2038 (0x7fffffff)
Oct 10 09:46:54 compute-2 podman[83847]: 2025-10-10 09:46:54.559360128 +0000 UTC m=+0.094288198 container init 5800067cdbcc263d30e91141fbfd65d1b3e7f5b67048140f597794aacb645a20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-rgw-rgw-compute-2-qujzwn, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:46:54 compute-2 podman[83847]: 2025-10-10 09:46:54.563971011 +0000 UTC m=+0.098899061 container start 5800067cdbcc263d30e91141fbfd65d1b3e7f5b67048140f597794aacb645a20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-rgw-rgw-compute-2-qujzwn, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Oct 10 09:46:54 compute-2 bash[83847]: 5800067cdbcc263d30e91141fbfd65d1b3e7f5b67048140f597794aacb645a20
Oct 10 09:46:54 compute-2 podman[83847]: 2025-10-10 09:46:54.483367537 +0000 UTC m=+0.018295617 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:46:54 compute-2 systemd[1]: Started Ceph rgw.rgw.compute-2.qujzwn for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:46:54 compute-2 radosgw[83867]: deferred set uid:gid to 167:167 (ceph:ceph)
Oct 10 09:46:54 compute-2 radosgw[83867]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Oct 10 09:46:54 compute-2 radosgw[83867]: framework: beast
Oct 10 09:46:54 compute-2 radosgw[83867]: framework conf key: endpoint, val: 192.168.122.102:8082
Oct 10 09:46:54 compute-2 radosgw[83867]: init_numa not setting numa affinity
Oct 10 09:46:54 compute-2 sudo[83624]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:55 compute-2 ceph-mon[74913]: Deploying daemon rgw.rgw.compute-2.qujzwn on compute-2
Oct 10 09:46:55 compute-2 ceph-mon[74913]: pgmap v17: 163 pgs: 163 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/117532342' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Oct 10 09:46:55 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:55 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:55 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:55 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.zajetc", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 09:46:55 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.zajetc", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 09:46:55 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:55 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Oct 10 09:46:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0)
Oct 10 09:46:55 compute-2 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2866042771' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Oct 10 09:46:56 compute-2 ceph-mon[74913]: Deploying daemon rgw.rgw.compute-1.zajetc on compute-1
Oct 10 09:46:56 compute-2 ceph-mon[74913]: osdmap e43: 3 total, 3 up, 3 in
Oct 10 09:46:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2866042771' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Oct 10 09:46:56 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Oct 10 09:46:56 compute-2 ceph-mon[74913]: pgmap v19: 164 pgs: 1 unknown, 163 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Oct 10 09:46:57 compute-2 ceph-mon[74913]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct 10 09:46:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4039652738' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Oct 10 09:46:57 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Oct 10 09:46:57 compute-2 ceph-mon[74913]: osdmap e44: 3 total, 3 up, 3 in
Oct 10 09:46:57 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:57 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:57 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:57 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.myiozw", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 09:46:57 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.myiozw", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 09:46:57 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:57 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:57 compute-2 ceph-mon[74913]: Deploying daemon rgw.rgw.compute-0.myiozw on compute-0
Oct 10 09:46:57 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Oct 10 09:46:57 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Oct 10 09:46:57 compute-2 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct 10 09:46:58 compute-2 ceph-mon[74913]: osdmap e45: 3 total, 3 up, 3 in
Oct 10 09:46:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct 10 09:46:58 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct 10 09:46:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/877657232' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct 10 09:46:58 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct 10 09:46:58 compute-2 ceph-mon[74913]: pgmap v22: 165 pgs: 2 unknown, 163 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:46:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/897202661' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Oct 10 09:46:58 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Oct 10 09:46:58 compute-2 sudo[84454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:46:58 compute-2 sudo[84454]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:58 compute-2 sudo[84454]: pam_unix(sudo:session): session closed for user root
Oct 10 09:46:58 compute-2 sudo[84479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:46:58 compute-2 sudo[84479]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:46:59 compute-2 podman[84544]: 2025-10-10 09:46:59.188804323 +0000 UTC m=+0.035001022 container create 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Oct 10 09:46:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Oct 10 09:46:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Oct 10 09:46:59 compute-2 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 09:46:59 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Oct 10 09:46:59 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Oct 10 09:46:59 compute-2 ceph-mon[74913]: osdmap e46: 3 total, 3 up, 3 in
Oct 10 09:46:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:59 compute-2 ceph-mon[74913]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Oct 10 09:46:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:46:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.vlgajy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Oct 10 09:46:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.vlgajy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Oct 10 09:46:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:46:59 compute-2 ceph-mon[74913]: Deploying daemon mds.cephfs.compute-2.vlgajy on compute-2
Oct 10 09:46:59 compute-2 systemd[1]: Started libpod-conmon-9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b.scope.
Oct 10 09:46:59 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:46:59 compute-2 podman[84544]: 2025-10-10 09:46:59.268249007 +0000 UTC m=+0.114445726 container init 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 10 09:46:59 compute-2 podman[84544]: 2025-10-10 09:46:59.173007618 +0000 UTC m=+0.019204357 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:46:59 compute-2 podman[84544]: 2025-10-10 09:46:59.273723268 +0000 UTC m=+0.119919977 container start 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Oct 10 09:46:59 compute-2 podman[84544]: 2025-10-10 09:46:59.276346805 +0000 UTC m=+0.122543544 container attach 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325)
Oct 10 09:46:59 compute-2 strange_perlman[84560]: 167 167
Oct 10 09:46:59 compute-2 systemd[1]: libpod-9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b.scope: Deactivated successfully.
Oct 10 09:46:59 compute-2 podman[84544]: 2025-10-10 09:46:59.279366485 +0000 UTC m=+0.125563204 container died 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Oct 10 09:46:59 compute-2 systemd[1]: var-lib-containers-storage-overlay-ea4e5282219a9c3ad20e5648cbd3f6dabaf721bc8bd7c3e59a27979325aa1cae-merged.mount: Deactivated successfully.
Oct 10 09:46:59 compute-2 podman[84544]: 2025-10-10 09:46:59.317046025 +0000 UTC m=+0.163242734 container remove 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:46:59 compute-2 systemd[1]: libpod-conmon-9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b.scope: Deactivated successfully.
Oct 10 09:46:59 compute-2 systemd[1]: Reloading.
Oct 10 09:46:59 compute-2 systemd-sysv-generator[84606]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:46:59 compute-2 systemd-rc-local-generator[84602]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:46:59 compute-2 systemd[1]: Reloading.
Oct 10 09:46:59 compute-2 systemd-sysv-generator[84646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:46:59 compute-2 systemd-rc-local-generator[84637]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:46:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:46:59 compute-2 systemd[1]: Starting Ceph mds.cephfs.compute-2.vlgajy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:47:00 compute-2 podman[84704]: 2025-10-10 09:47:00.069028813 +0000 UTC m=+0.038848229 container create ae094c500c98e27a9d77505176172d5dcddf180ecdc3a15416df9589a6fb1109 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mds-cephfs-compute-2-vlgajy, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Oct 10 09:47:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4b0ebf5d25c155ec5522c8f977ff660b395ef0bd052c28d0c4fa42a5f174a5d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:47:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4b0ebf5d25c155ec5522c8f977ff660b395ef0bd052c28d0c4fa42a5f174a5d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 09:47:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4b0ebf5d25c155ec5522c8f977ff660b395ef0bd052c28d0c4fa42a5f174a5d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 09:47:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4b0ebf5d25c155ec5522c8f977ff660b395ef0bd052c28d0c4fa42a5f174a5d/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.vlgajy supports timestamps until 2038 (0x7fffffff)
Oct 10 09:47:00 compute-2 podman[84704]: 2025-10-10 09:47:00.133377157 +0000 UTC m=+0.103196563 container init ae094c500c98e27a9d77505176172d5dcddf180ecdc3a15416df9589a6fb1109 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mds-cephfs-compute-2-vlgajy, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 10 09:47:00 compute-2 podman[84704]: 2025-10-10 09:47:00.13888457 +0000 UTC m=+0.108703986 container start ae094c500c98e27a9d77505176172d5dcddf180ecdc3a15416df9589a6fb1109 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mds-cephfs-compute-2-vlgajy, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:47:00 compute-2 bash[84704]: ae094c500c98e27a9d77505176172d5dcddf180ecdc3a15416df9589a6fb1109
Oct 10 09:47:00 compute-2 podman[84704]: 2025-10-10 09:47:00.049234777 +0000 UTC m=+0.019054203 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:47:00 compute-2 systemd[1]: Started Ceph mds.cephfs.compute-2.vlgajy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:47:00 compute-2 ceph-mds[84723]: set uid:gid to 167:167 (ceph:ceph)
Oct 10 09:47:00 compute-2 ceph-mds[84723]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Oct 10 09:47:00 compute-2 ceph-mds[84723]: main not setting numa affinity
Oct 10 09:47:00 compute-2 ceph-mds[84723]: pidfile_write: ignore empty --pid-file
Oct 10 09:47:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mds-cephfs-compute-2-vlgajy[84719]: starting mds.cephfs.compute-2.vlgajy at 
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Updating MDS map to version 2 from mon.1
Oct 10 09:47:00 compute-2 sudo[84479]: pam_unix(sudo:session): session closed for user root
Oct 10 09:47:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Oct 10 09:47:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e3 new map
Oct 10 09:47:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e3 print_map
                                           e3
                                           btime 2025-10-10T09:47:00:211513+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-10-10T09:46:34.511367+0000
                                           modified        2025-10-10T09:46:34.511367+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-2.vlgajy{-1:24337} state up:standby seq 1 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 09:47:00 compute-2 ceph-mon[74913]: osdmap e47: 3 total, 3 up, 3 in
Oct 10 09:47:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 09:47:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/877657232' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 09:47:00 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 09:47:00 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 09:47:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 09:47:00 compute-2 ceph-mon[74913]: pgmap v25: 166 pgs: 3 unknown, 163 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:47:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4100066023' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Oct 10 09:47:00 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Oct 10 09:47:00 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Oct 10 09:47:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Oct 10 09:47:00 compute-2 ceph-mon[74913]: osdmap e48: 3 total, 3 up, 3 in
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Updating MDS map to version 3 from mon.1
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Monitors have assigned me to become a standby
Oct 10 09:47:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e4 new map
Oct 10 09:47:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e4 print_map
                                           e4
                                           btime 2025-10-10T09:47:00:244509+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-10-10T09:46:34.511367+0000
                                           modified        2025-10-10T09:47:00.244232+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24337}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-2.vlgajy{0:24337} state up:creating seq 1 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Updating MDS map to version 4 from mon.1
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.4 handle_mds_map I am now mds.0.4
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x1
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x100
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x600
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x601
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x602
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x603
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x604
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x605
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x606
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x607
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x608
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x609
Oct 10 09:47:00 compute-2 ceph-mds[84723]: mds.0.4 creating_done
Oct 10 09:47:01 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Oct 10 09:47:01 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Oct 10 09:47:01 compute-2 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 09:47:01 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:01 compute-2 ceph-mon[74913]: mds.? [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] up:boot
Oct 10 09:47:01 compute-2 ceph-mon[74913]: daemon mds.cephfs.compute-2.vlgajy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Oct 10 09:47:01 compute-2 ceph-mon[74913]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Oct 10 09:47:01 compute-2 ceph-mon[74913]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Oct 10 09:47:01 compute-2 ceph-mon[74913]: fsmap cephfs:0 1 up:standby
Oct 10 09:47:01 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.vlgajy"}]: dispatch
Oct 10 09:47:01 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:01 compute-2 ceph-mon[74913]: fsmap cephfs:1 {0=cephfs.compute-2.vlgajy=up:creating}
Oct 10 09:47:01 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:01 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.cchwlo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Oct 10 09:47:01 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.cchwlo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Oct 10 09:47:01 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:47:01 compute-2 ceph-mon[74913]: daemon mds.cephfs.compute-2.vlgajy is now active in filesystem cephfs as rank 0
Oct 10 09:47:01 compute-2 ceph-mon[74913]: Deploying daemon mds.cephfs.compute-0.cchwlo on compute-0
Oct 10 09:47:01 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e5 new map
Oct 10 09:47:01 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e5 print_map
                                           e5
                                           btime 2025-10-10T09:47:01:287113+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-10-10T09:46:34.511367+0000
                                           modified        2025-10-10T09:47:01.287110+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24337}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24337 members: 24337
                                           [mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 2 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Oct 10 09:47:01 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Updating MDS map to version 5 from mon.1
Oct 10 09:47:01 compute-2 ceph-mds[84723]: mds.0.4 handle_mds_map I am now mds.0.4
Oct 10 09:47:01 compute-2 ceph-mds[84723]: mds.0.4 handle_mds_map state change up:creating --> up:active
Oct 10 09:47:01 compute-2 ceph-mds[84723]: mds.0.4 recovery_done -- successful recovery!
Oct 10 09:47:01 compute-2 ceph-mds[84723]: mds.0.4 active_start
Oct 10 09:47:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Oct 10 09:47:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Oct 10 09:47:02 compute-2 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 09:47:02 compute-2 ceph-mon[74913]: osdmap e49: 3 total, 3 up, 3 in
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/877657232' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 09:47:02 compute-2 ceph-mon[74913]: mds.? [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] up:active
Oct 10 09:47:02 compute-2 ceph-mon[74913]: fsmap cephfs:1 {0=cephfs.compute-2.vlgajy=up:active}
Oct 10 09:47:02 compute-2 ceph-mon[74913]: pgmap v28: 167 pgs: 1 unknown, 1 creating+peering, 165 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 6.0 KiB/s rd, 1.5 KiB/s wr, 8 op/s
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.fhagzt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.fhagzt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Oct 10 09:47:02 compute-2 ceph-mon[74913]: osdmap e50: 3 total, 3 up, 3 in
Oct 10 09:47:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 09:47:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e6 new map
Oct 10 09:47:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e6 print_map
                                           e6
                                           btime 2025-10-10T09:47:02.297566+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-10-10T09:46:34.511367+0000
                                           modified        2025-10-10T09:47:01.287110+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24337}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24337 members: 24337
                                           [mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 2 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.cchwlo{-1:14592} state up:standby seq 1 addr [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 09:47:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e7 new map
Oct 10 09:47:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e7 print_map
                                           e7
                                           btime 2025-10-10T09:47:02.322797+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-10-10T09:46:34.511367+0000
                                           modified        2025-10-10T09:47:01.287110+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24337}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24337 members: 24337
                                           [mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 2 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.cchwlo{-1:14592} state up:standby seq 1 addr [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 09:47:03 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Oct 10 09:47:03 compute-2 ceph-mon[74913]: Deploying daemon mds.cephfs.compute-1.fhagzt on compute-1
Oct 10 09:47:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 09:47:03 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 09:47:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/877657232' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 09:47:03 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 09:47:03 compute-2 ceph-mon[74913]: mds.? [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] up:boot
Oct 10 09:47:03 compute-2 ceph-mon[74913]: fsmap cephfs:1 {0=cephfs.compute-2.vlgajy=up:active} 1 up:standby
Oct 10 09:47:03 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.cchwlo"}]: dispatch
Oct 10 09:47:03 compute-2 ceph-mon[74913]: fsmap cephfs:1 {0=cephfs.compute-2.vlgajy=up:active} 1 up:standby
Oct 10 09:47:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Oct 10 09:47:03 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Oct 10 09:47:03 compute-2 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Oct 10 09:47:03 compute-2 ceph-mon[74913]: osdmap e51: 3 total, 3 up, 3 in
Oct 10 09:47:03 compute-2 radosgw[83867]: v1 topic migration: starting v1 topic migration..
Oct 10 09:47:03 compute-2 radosgw[83867]: LDAP not started since no server URIs were provided in the configuration.
Oct 10 09:47:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-rgw-rgw-compute-2-qujzwn[83863]: 2025-10-10T09:47:03.484+0000 7f3ea7b78980 -1 LDAP not started since no server URIs were provided in the configuration.
Oct 10 09:47:03 compute-2 radosgw[83867]: v1 topic migration: finished v1 topic migration
Oct 10 09:47:03 compute-2 radosgw[83867]: framework: beast
Oct 10 09:47:03 compute-2 radosgw[83867]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Oct 10 09:47:03 compute-2 radosgw[83867]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Oct 10 09:47:03 compute-2 radosgw[83867]: starting handler: beast
Oct 10 09:47:03 compute-2 radosgw[83867]: set uid:gid to 167:167 (ceph:ceph)
Oct 10 09:47:03 compute-2 radosgw[83867]: mgrc service_daemon_register rgw.24304 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.qujzwn,kernel_description=#1 SMP PREEMPT_DYNAMIC Tue Sep 30 07:37:35 UTC 2025,kernel_version=5.14.0-621.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864356,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=ac475a20-bf0e-4531-bd8b-a44afde7c93f,zone_name=default,zonegroup_id=8929b431-04ce-48e1-bb4a-cedab812d97d,zonegroup_name=default}
Oct 10 09:47:03 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Oct 10 09:47:03 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Oct 10 09:47:03 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Oct 10 09:47:03 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Oct 10 09:47:03 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Oct 10 09:47:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e8 new map
Oct 10 09:47:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e8 print_map
                                           e8
                                           btime 2025-10-10T09:47:04.615775+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-10-10T09:46:34.511367+0000
                                           modified        2025-10-10T09:47:04.295946+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24337}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24337 members: 24337
                                           [mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.cchwlo{-1:14592} state up:standby seq 1 addr [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.fhagzt{-1:24206} state up:standby seq 1 addr [v2:192.168.122.101:6804/1757766640,v1:192.168.122.101:6805/1757766640] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 09:47:04 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Updating MDS map to version 8 from mon.1
Oct 10 09:47:04 compute-2 ceph-mon[74913]: pgmap v31: 167 pgs: 1 unknown, 1 creating+peering, 165 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 6.0 KiB/s rd, 1.5 KiB/s wr, 8 op/s
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:04 compute-2 ceph-mon[74913]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Oct 10 09:47:04 compute-2 ceph-mon[74913]: Cluster is now healthy
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mssvzx", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mssvzx", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mssvzx-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mssvzx-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 09:47:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:47:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:05 compute-2 ceph-mds[84723]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Oct 10 09:47:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mds-cephfs-compute-2-vlgajy[84719]: 2025-10-10T09:47:05.250+0000 7efc8f0e1640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Oct 10 09:47:05 compute-2 ceph-mon[74913]: Creating key for client.nfs.cephfs.0.0.compute-1.mssvzx
Oct 10 09:47:05 compute-2 ceph-mon[74913]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Oct 10 09:47:05 compute-2 ceph-mon[74913]: Rados config object exists: conf-nfs.cephfs
Oct 10 09:47:05 compute-2 ceph-mon[74913]: Creating key for client.nfs.cephfs.0.0.compute-1.mssvzx-rgw
Oct 10 09:47:05 compute-2 ceph-mon[74913]: Bind address in nfs.cephfs.0.0.compute-1.mssvzx's ganesha conf is defaulting to empty
Oct 10 09:47:05 compute-2 ceph-mon[74913]: Deploying daemon nfs.cephfs.0.0.compute-1.mssvzx on compute-1
Oct 10 09:47:05 compute-2 ceph-mon[74913]: mds.? [v2:192.168.122.101:6804/1757766640,v1:192.168.122.101:6805/1757766640] up:boot
Oct 10 09:47:05 compute-2 ceph-mon[74913]: mds.? [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] up:active
Oct 10 09:47:05 compute-2 ceph-mon[74913]: fsmap cephfs:1 {0=cephfs.compute-2.vlgajy=up:active} 2 up:standby
Oct 10 09:47:05 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.fhagzt"}]: dispatch
Oct 10 09:47:06 compute-2 ceph-mon[74913]: pgmap v32: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 241 KiB/s rd, 9.4 KiB/s wr, 445 op/s
Oct 10 09:47:06 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e9 new map
Oct 10 09:47:06 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e9 print_map
                                           e9
                                           btime 2025-10-10T09:47:06.672904+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-10-10T09:46:34.511367+0000
                                           modified        2025-10-10T09:47:04.295946+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24337}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24337 members: 24337
                                           [mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.cchwlo{-1:14592} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.fhagzt{-1:24206} state up:standby seq 1 addr [v2:192.168.122.101:6804/1757766640,v1:192.168.122.101:6805/1757766640] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 09:47:07 compute-2 ceph-mon[74913]: mds.? [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] up:standby
Oct 10 09:47:07 compute-2 ceph-mon[74913]: fsmap cephfs:1 {0=cephfs.compute-2.vlgajy=up:active} 2 up:standby
Oct 10 09:47:07 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:07 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:07 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:07 compute-2 ceph-mon[74913]: Creating key for client.nfs.cephfs.1.0.compute-2.boccfy
Oct 10 09:47:07 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.boccfy", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Oct 10 09:47:07 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.boccfy", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Oct 10 09:47:07 compute-2 ceph-mon[74913]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Oct 10 09:47:07 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Oct 10 09:47:07 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Oct 10 09:47:07 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:47:08 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e10 new map
Oct 10 09:47:08 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).mds e10 print_map
                                           e10
                                           btime 2025-10-10T09:47:08.789045+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-10-10T09:46:34.511367+0000
                                           modified        2025-10-10T09:47:04.295946+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24337}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24337 members: 24337
                                           [mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.cchwlo{-1:14592} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.fhagzt{-1:24206} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/1757766640,v1:192.168.122.101:6805/1757766640] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 09:47:08 compute-2 ceph-mon[74913]: pgmap v33: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 204 KiB/s rd, 8.0 KiB/s wr, 376 op/s
Oct 10 09:47:08 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:09 compute-2 ceph-mon[74913]: mds.? [v2:192.168.122.101:6804/1757766640,v1:192.168.122.101:6805/1757766640] up:standby
Oct 10 09:47:09 compute-2 ceph-mon[74913]: fsmap cephfs:1 {0=cephfs.compute-2.vlgajy=up:active} 2 up:standby
Oct 10 09:47:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:10 compute-2 sudo[84791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:47:10 compute-2 sudo[84791]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:47:10 compute-2 sudo[84791]: pam_unix(sudo:session): session closed for user root
Oct 10 09:47:10 compute-2 sudo[84816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:47:10 compute-2 sudo[84816]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:47:10 compute-2 podman[84880]: 2025-10-10 09:47:10.526056472 +0000 UTC m=+0.036730760 container create 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 10 09:47:10 compute-2 systemd[1]: Started libpod-conmon-40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0.scope.
Oct 10 09:47:10 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:47:10 compute-2 podman[84880]: 2025-10-10 09:47:10.598148022 +0000 UTC m=+0.108822350 container init 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:47:10 compute-2 podman[84880]: 2025-10-10 09:47:10.604983679 +0000 UTC m=+0.115657987 container start 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Oct 10 09:47:10 compute-2 podman[84880]: 2025-10-10 09:47:10.511020833 +0000 UTC m=+0.021695151 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:47:10 compute-2 podman[84880]: 2025-10-10 09:47:10.60830112 +0000 UTC m=+0.118975438 container attach 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 09:47:10 compute-2 magical_hellman[84896]: 167 167
Oct 10 09:47:10 compute-2 systemd[1]: libpod-40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0.scope: Deactivated successfully.
Oct 10 09:47:10 compute-2 podman[84880]: 2025-10-10 09:47:10.609288882 +0000 UTC m=+0.119963180 container died 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 09:47:10 compute-2 systemd[1]: var-lib-containers-storage-overlay-ba91f52bfc249c2abaa9b2534619c7879ccc57ac6cf15fd7ef0a318f8659aad6-merged.mount: Deactivated successfully.
Oct 10 09:47:10 compute-2 podman[84880]: 2025-10-10 09:47:10.652407742 +0000 UTC m=+0.163082060 container remove 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Oct 10 09:47:10 compute-2 systemd[1]: libpod-conmon-40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0.scope: Deactivated successfully.
Oct 10 09:47:10 compute-2 systemd[1]: Reloading.
Oct 10 09:47:10 compute-2 systemd-rc-local-generator[84938]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:47:10 compute-2 systemd-sysv-generator[84943]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:47:10 compute-2 ceph-mon[74913]: pgmap v34: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 160 KiB/s rd, 6.2 KiB/s wr, 295 op/s
Oct 10 09:47:10 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Oct 10 09:47:10 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Oct 10 09:47:10 compute-2 ceph-mon[74913]: Rados config object exists: conf-nfs.cephfs
Oct 10 09:47:10 compute-2 ceph-mon[74913]: Creating key for client.nfs.cephfs.1.0.compute-2.boccfy-rgw
Oct 10 09:47:10 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.boccfy-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 09:47:10 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.boccfy-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 09:47:10 compute-2 ceph-mon[74913]: Bind address in nfs.cephfs.1.0.compute-2.boccfy's ganesha conf is defaulting to empty
Oct 10 09:47:10 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:47:10 compute-2 ceph-mon[74913]: Deploying daemon nfs.cephfs.1.0.compute-2.boccfy on compute-2
Oct 10 09:47:10 compute-2 systemd[1]: Reloading.
Oct 10 09:47:11 compute-2 systemd-rc-local-generator[84977]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:47:11 compute-2 systemd-sysv-generator[84980]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:47:11 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:47:11 compute-2 podman[85035]: 2025-10-10 09:47:11.535652613 +0000 UTC m=+0.065229334 container create c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 10 09:47:11 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65c18ffc3984bb82f7acc157cc3b25e9b8553569bbeae84a5fa3da5f5bd939d9/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 09:47:11 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65c18ffc3984bb82f7acc157cc3b25e9b8553569bbeae84a5fa3da5f5bd939d9/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:47:11 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65c18ffc3984bb82f7acc157cc3b25e9b8553569bbeae84a5fa3da5f5bd939d9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:47:11 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65c18ffc3984bb82f7acc157cc3b25e9b8553569bbeae84a5fa3da5f5bd939d9/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:47:11 compute-2 podman[85035]: 2025-10-10 09:47:11.513531199 +0000 UTC m=+0.043107940 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:47:11 compute-2 podman[85035]: 2025-10-10 09:47:11.612552893 +0000 UTC m=+0.142129584 container init c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Oct 10 09:47:11 compute-2 podman[85035]: 2025-10-10 09:47:11.617176086 +0000 UTC m=+0.146752767 container start c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 09:47:11 compute-2 bash[85035]: c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292
Oct 10 09:47:11 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 09:47:11 compute-2 sudo[84816]: pam_unix(sudo:session): session closed for user root
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000002:nfs.cephfs.1: -2
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 09:47:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:47:12 compute-2 ceph-mon[74913]: pgmap v35: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 162 KiB/s rd, 6.1 KiB/s wr, 299 op/s
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:12 compute-2 ceph-mon[74913]: Creating key for client.nfs.cephfs.2.0.compute-0.ruydzo
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ruydzo", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ruydzo", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Oct 10 09:47:12 compute-2 ceph-mon[74913]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Oct 10 09:47:12 compute-2 ceph-mon[74913]: Rados config object exists: conf-nfs.cephfs
Oct 10 09:47:12 compute-2 ceph-mon[74913]: Creating key for client.nfs.cephfs.2.0.compute-0.ruydzo-rgw
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ruydzo-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ruydzo-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 09:47:12 compute-2 ceph-mon[74913]: Bind address in nfs.cephfs.2.0.compute-0.ruydzo's ganesha conf is defaulting to empty
Oct 10 09:47:12 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:47:12 compute-2 ceph-mon[74913]: Deploying daemon nfs.cephfs.2.0.compute-0.ruydzo on compute-0
Oct 10 09:47:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:47:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:47:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:47:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:47:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:47:14 compute-2 ceph-mon[74913]: pgmap v36: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 146 KiB/s rd, 5.6 KiB/s wr, 270 op/s
Oct 10 09:47:14 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:14 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:14 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:14 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:14 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:14 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:14 compute-2 ceph-mon[74913]: Deploying daemon haproxy.nfs.cephfs.compute-1.ehhoyw on compute-1
Oct 10 09:47:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:16 compute-2 ceph-mon[74913]: pgmap v37: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 130 KiB/s rd, 6.6 KiB/s wr, 239 op/s
Oct 10 09:47:18 compute-2 ceph-mon[74913]: pgmap v38: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 2.4 KiB/s wr, 42 op/s
Oct 10 09:47:18 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:19 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8968000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:19 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:19 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:19 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:19 compute-2 ceph-mon[74913]: Deploying daemon haproxy.nfs.cephfs.compute-0.gptveb on compute-0
Oct 10 09:47:20 compute-2 ceph-mon[74913]: pgmap v39: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 2.4 KiB/s wr, 42 op/s
Oct 10 09:47:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:21 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:22 compute-2 ceph-mon[74913]: pgmap v40: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 2.4 KiB/s wr, 42 op/s
Oct 10 09:47:23 compute-2 sudo[85107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:47:23 compute-2 sudo[85107]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:47:23 compute-2 sudo[85107]: pam_unix(sudo:session): session closed for user root
Oct 10 09:47:23 compute-2 sudo[85132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:47:23 compute-2 sudo[85132]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:47:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:23 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:23 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:24 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:24 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:24 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:24 compute-2 ceph-mon[74913]: Deploying daemon haproxy.nfs.cephfs.compute-2.eokdol on compute-2
Oct 10 09:47:24 compute-2 ceph-mon[74913]: pgmap v41: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 5.1 KiB/s rd, 1.8 KiB/s wr, 7 op/s
Oct 10 09:47:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:25 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:25 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:26 compute-2 podman[85197]: 2025-10-10 09:47:26.249075285 +0000 UTC m=+2.497893669 container create 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 09:47:26 compute-2 podman[85197]: 2025-10-10 09:47:26.230611922 +0000 UTC m=+2.479430326 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Oct 10 09:47:26 compute-2 systemd[1]: Started libpod-conmon-30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d.scope.
Oct 10 09:47:26 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:47:26 compute-2 podman[85197]: 2025-10-10 09:47:26.327110783 +0000 UTC m=+2.575929197 container init 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 09:47:26 compute-2 podman[85197]: 2025-10-10 09:47:26.332090378 +0000 UTC m=+2.580908762 container start 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 09:47:26 compute-2 podman[85197]: 2025-10-10 09:47:26.335375057 +0000 UTC m=+2.584193441 container attach 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 09:47:26 compute-2 jolly_babbage[85315]: 0 0
Oct 10 09:47:26 compute-2 podman[85197]: 2025-10-10 09:47:26.338070796 +0000 UTC m=+2.586889190 container died 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 09:47:26 compute-2 systemd[1]: libpod-30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d.scope: Deactivated successfully.
Oct 10 09:47:26 compute-2 systemd[1]: var-lib-containers-storage-overlay-efd1d3a6dde83fb0e26726be44a6f416d3dc461412ded846f2b3f411afda69e4-merged.mount: Deactivated successfully.
Oct 10 09:47:26 compute-2 podman[85197]: 2025-10-10 09:47:26.379170019 +0000 UTC m=+2.627988403 container remove 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 09:47:26 compute-2 systemd[1]: libpod-conmon-30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d.scope: Deactivated successfully.
Oct 10 09:47:26 compute-2 systemd[1]: Reloading.
Oct 10 09:47:26 compute-2 systemd-rc-local-generator[85359]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:47:26 compute-2 systemd-sysv-generator[85362]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:47:26 compute-2 ceph-mon[74913]: pgmap v42: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 5.3 KiB/s rd, 1.8 KiB/s wr, 7 op/s
Oct 10 09:47:26 compute-2 systemd[1]: Reloading.
Oct 10 09:47:26 compute-2 systemd-rc-local-generator[85402]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:47:26 compute-2 systemd-sysv-generator[85407]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:47:27 compute-2 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-2.eokdol for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:47:27 compute-2 podman[85459]: 2025-10-10 09:47:27.33932642 +0000 UTC m=+0.041931721 container create 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 09:47:27 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23862060fcbe060db6cccf6bb2d67f0a22678bd92e5ebb08f9d9555dbfd63796/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Oct 10 09:47:27 compute-2 podman[85459]: 2025-10-10 09:47:27.411207015 +0000 UTC m=+0.113812416 container init 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 09:47:27 compute-2 podman[85459]: 2025-10-10 09:47:27.323778945 +0000 UTC m=+0.026384266 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Oct 10 09:47:27 compute-2 podman[85459]: 2025-10-10 09:47:27.42616994 +0000 UTC m=+0.128775281 container start 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 09:47:27 compute-2 bash[85459]: 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0
Oct 10 09:47:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [NOTICE] 282/094727 (2) : New worker #1 (4) forked
Oct 10 09:47:27 compute-2 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-2.eokdol for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:47:27 compute-2 sudo[85132]: pam_unix(sudo:session): session closed for user root
Oct 10 09:47:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:27 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:27 compute-2 sudo[85489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:47:27 compute-2 sudo[85489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:47:27 compute-2 sudo[85489]: pam_unix(sudo:session): session closed for user root
Oct 10 09:47:27 compute-2 sudo[85514]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:47:27 compute-2 sudo[85514]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:47:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:27 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:28 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.417863) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648418082, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6519, "num_deletes": 256, "total_data_size": 17899569, "memory_usage": 19298368, "flush_reason": "Manual Compaction"}
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648465917, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11386863, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 6524, "table_properties": {"data_size": 11362444, "index_size": 15281, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8069, "raw_key_size": 78427, "raw_average_key_size": 24, "raw_value_size": 11300778, "raw_average_value_size": 3506, "num_data_blocks": 678, "num_entries": 3223, "num_filter_entries": 3223, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 1760089519, "file_creation_time": 1760089648, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 48147 microseconds, and 22076 cpu microseconds.
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.466004) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11386863 bytes OK
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.466050) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.469497) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.469520) EVENT_LOG_v1 {"time_micros": 1760089648469513, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.469548) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 17864492, prev total WAL file size 17864492, number of live WAL files 2.
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.476038) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323532' seq:0, type:0; will stop at (end)
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(1648B)]
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648476202, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11388511, "oldest_snapshot_seqno": -1}
Oct 10 09:47:28 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:28 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:28 compute-2 ceph-mon[74913]: pgmap v43: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:47:28 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:28 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:28 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct 10 09:47:28 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct 10 09:47:28 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Oct 10 09:47:28 compute-2 ceph-mon[74913]: Deploying daemon keepalived.nfs.cephfs.compute-2.fcbgvm on compute-2
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 2970 keys, 11383085 bytes, temperature: kUnknown
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648562681, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11383085, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11359283, "index_size": 15245, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7493, "raw_key_size": 74964, "raw_average_key_size": 25, "raw_value_size": 11300836, "raw_average_value_size": 3804, "num_data_blocks": 676, "num_entries": 2970, "num_filter_entries": 2970, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760089648, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.563169) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11383085 bytes
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.564734) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 131.4 rd, 131.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.9, 0.0 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3228, records dropped: 258 output_compression: NoCompression
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.564784) EVENT_LOG_v1 {"time_micros": 1760089648564752, "job": 4, "event": "compaction_finished", "compaction_time_micros": 86659, "compaction_time_cpu_micros": 44766, "output_level": 6, "num_output_files": 1, "total_output_size": 11383085, "num_input_records": 3228, "num_output_records": 2970, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648569218, "job": 4, "event": "table_file_deletion", "file_number": 14}
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648569326, "job": 4, "event": "table_file_deletion", "file_number": 8}
Oct 10 09:47:28 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.475810) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:47:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:29 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:29 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:30 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:30 compute-2 ceph-mon[74913]: pgmap v44: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:47:31 compute-2 podman[85580]: 2025-10-10 09:47:31.24875015 +0000 UTC m=+3.138721491 container create cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, com.redhat.component=keepalived-container, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, version=2.2.4, name=keepalived, release=1793, summary=Provides keepalived on RHEL 9 for Ceph.)
Oct 10 09:47:31 compute-2 podman[85580]: 2025-10-10 09:47:31.234080223 +0000 UTC m=+3.124051564 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Oct 10 09:47:31 compute-2 systemd[1]: Started libpod-conmon-cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf.scope.
Oct 10 09:47:31 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:47:31 compute-2 podman[85580]: 2025-10-10 09:47:31.309628379 +0000 UTC m=+3.199599750 container init cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, io.buildah.version=1.28.2, architecture=x86_64, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, distribution-scope=public, build-date=2023-02-22T09:23:20, version=2.2.4, vcs-type=git, com.redhat.component=keepalived-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc.)
Oct 10 09:47:31 compute-2 podman[85580]: 2025-10-10 09:47:31.317586572 +0000 UTC m=+3.207557913 container start cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, release=1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, vcs-type=git, com.redhat.component=keepalived-container)
Oct 10 09:47:31 compute-2 podman[85580]: 2025-10-10 09:47:31.320666725 +0000 UTC m=+3.210638116 container attach cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, vcs-type=git, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, release=1793, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., name=keepalived, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph.)
Oct 10 09:47:31 compute-2 sad_yalow[85679]: 0 0
Oct 10 09:47:31 compute-2 systemd[1]: libpod-cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf.scope: Deactivated successfully.
Oct 10 09:47:31 compute-2 conmon[85679]: conmon cae100c2e8df9845474a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf.scope/container/memory.events
Oct 10 09:47:31 compute-2 podman[85580]: 2025-10-10 09:47:31.325737403 +0000 UTC m=+3.215708744 container died cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, release=1793, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, io.buildah.version=1.28.2, architecture=x86_64, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, distribution-scope=public, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, vcs-type=git, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc.)
Oct 10 09:47:31 compute-2 systemd[1]: var-lib-containers-storage-overlay-5670ca1ecf11d55c7732da222050b203d0f5568cb8345c4225ef46ef2a78289c-merged.mount: Deactivated successfully.
Oct 10 09:47:31 compute-2 podman[85580]: 2025-10-10 09:47:31.365690558 +0000 UTC m=+3.255661899 container remove cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., distribution-scope=public, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, architecture=x86_64, com.redhat.component=keepalived-container, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, version=2.2.4)
Oct 10 09:47:31 compute-2 systemd[1]: libpod-conmon-cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf.scope: Deactivated successfully.
Oct 10 09:47:31 compute-2 systemd[1]: Reloading.
Oct 10 09:47:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:31 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:31 compute-2 systemd-sysv-generator[85730]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:47:31 compute-2 systemd-rc-local-generator[85726]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:47:31 compute-2 systemd[1]: Reloading.
Oct 10 09:47:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:31 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:31 compute-2 systemd-sysv-generator[85769]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:47:31 compute-2 systemd-rc-local-generator[85763]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:47:32 compute-2 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-2.fcbgvm for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:47:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:32 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:32 compute-2 podman[85824]: 2025-10-10 09:47:32.370524911 +0000 UTC m=+0.043128430 container create 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, vcs-type=git, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., architecture=x86_64, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, io.openshift.expose-services=)
Oct 10 09:47:32 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09511f2789405164399a7808eeeb44e6de8e40c54e9fca8640fe041738223d0f/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:47:32 compute-2 podman[85824]: 2025-10-10 09:47:32.43714988 +0000 UTC m=+0.109753429 container init 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, version=2.2.4, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, io.buildah.version=1.28.2, release=1793, summary=Provides keepalived on RHEL 9 for Ceph.)
Oct 10 09:47:32 compute-2 podman[85824]: 2025-10-10 09:47:32.442361734 +0000 UTC m=+0.114965273 container start 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, release=1793, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, architecture=x86_64, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 10 09:47:32 compute-2 podman[85824]: 2025-10-10 09:47:32.354183749 +0000 UTC m=+0.026787288 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Oct 10 09:47:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Starting Keepalived v2.2.4 (08/21,2021)
Oct 10 09:47:32 compute-2 bash[85824]: 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c
Oct 10 09:47:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Running on Linux 5.14.0-621.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Tue Sep 30 07:37:35 UTC 2025 (built for Linux 5.14.0)
Oct 10 09:47:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Oct 10 09:47:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Configuration file /etc/keepalived/keepalived.conf
Oct 10 09:47:32 compute-2 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-2.fcbgvm for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:47:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Oct 10 09:47:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Starting VRRP child process, pid=4
Oct 10 09:47:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Startup complete
Oct 10 09:47:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: (VI_0) Entering BACKUP STATE (init)
Oct 10 09:47:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: VRRP_Script(check_backend) succeeded
Oct 10 09:47:32 compute-2 sudo[85514]: pam_unix(sudo:session): session closed for user root
Oct 10 09:47:32 compute-2 ceph-mon[74913]: pgmap v45: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:47:32 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:32 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:32 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:33 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:33 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Oct 10 09:47:33 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Oct 10 09:47:33 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct 10 09:47:33 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct 10 09:47:33 compute-2 ceph-mon[74913]: Deploying daemon keepalived.nfs.cephfs.compute-1.twbftp on compute-1
Oct 10 09:47:33 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 09:47:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:33 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:34 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Oct 10 09:47:34 compute-2 ceph-mon[74913]: pgmap v46: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:47:34 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Oct 10 09:47:34 compute-2 ceph-mon[74913]: osdmap e52: 3 total, 3 up, 3 in
Oct 10 09:47:34 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 09:47:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:35 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Oct 10 09:47:35 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Oct 10 09:47:35 compute-2 ceph-mon[74913]: osdmap e53: 3 total, 3 up, 3 in
Oct 10 09:47:35 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 09:47:35 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 09:47:35 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 09:47:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:35 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:36 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:36 2025: (VI_0) Entering MASTER STATE
Oct 10 09:47:36 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Oct 10 09:47:36 compute-2 ceph-mon[74913]: pgmap v49: 167 pgs: 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 383 B/s rd, 0 op/s
Oct 10 09:47:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Oct 10 09:47:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 09:47:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 09:47:36 compute-2 ceph-mon[74913]: osdmap e54: 3 total, 3 up, 3 in
Oct 10 09:47:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 09:47:36 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Oct 10 09:47:36 compute-2 ceph-mon[74913]: osdmap e55: 3 total, 3 up, 3 in
Oct 10 09:47:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:37 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:37 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Oct 10 09:47:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 09:47:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 09:47:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 09:47:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Oct 10 09:47:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 09:47:37 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 09:47:37 compute-2 ceph-mon[74913]: osdmap e56: 3 total, 3 up, 3 in
Oct 10 09:47:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:37 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:38 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:38 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Oct 10 09:47:38 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct 10 09:47:38 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Oct 10 09:47:38 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct 10 09:47:38 compute-2 ceph-mon[74913]: Deploying daemon keepalived.nfs.cephfs.compute-0.mciijj on compute-0
Oct 10 09:47:38 compute-2 ceph-mon[74913]: pgmap v52: 229 pgs: 62 unknown, 167 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Oct 10 09:47:38 compute-2 ceph-mon[74913]: 8.6 deep-scrub starts
Oct 10 09:47:38 compute-2 ceph-mon[74913]: 8.6 deep-scrub ok
Oct 10 09:47:38 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 09:47:38 compute-2 ceph-mon[74913]: 7.1c scrub starts
Oct 10 09:47:38 compute-2 ceph-mon[74913]: 7.1c scrub ok
Oct 10 09:47:38 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Oct 10 09:47:38 compute-2 ceph-mon[74913]: osdmap e57: 3 total, 3 up, 3 in
Oct 10 09:47:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:39 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Oct 10 09:47:39 compute-2 ceph-mon[74913]: 8.10 scrub starts
Oct 10 09:47:39 compute-2 ceph-mon[74913]: 8.10 scrub ok
Oct 10 09:47:39 compute-2 ceph-mon[74913]: 7.1f scrub starts
Oct 10 09:47:39 compute-2 ceph-mon[74913]: 7.1f scrub ok
Oct 10 09:47:39 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 09:47:39 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 09:47:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e58 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:39 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:40 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Oct 10 09:47:40 compute-2 ceph-mon[74913]: pgmap v55: 291 pgs: 62 unknown, 229 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:47:40 compute-2 ceph-mon[74913]: 8.11 scrub starts
Oct 10 09:47:40 compute-2 ceph-mon[74913]: 8.11 scrub ok
Oct 10 09:47:40 compute-2 ceph-mon[74913]: 7.1d scrub starts
Oct 10 09:47:40 compute-2 ceph-mon[74913]: 7.1d scrub ok
Oct 10 09:47:40 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 09:47:40 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 09:47:40 compute-2 ceph-mon[74913]: osdmap e58: 3 total, 3 up, 3 in
Oct 10 09:47:40 compute-2 ceph-mon[74913]: osdmap e59: 3 total, 3 up, 3 in
Oct 10 09:47:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:40 2025: (VI_0) Received advert from 192.168.122.101 with lower priority 90, ours 90, forcing new election
Oct 10 09:47:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:41 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:41 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Oct 10 09:47:41 compute-2 ceph-mon[74913]: 8.14 scrub starts
Oct 10 09:47:41 compute-2 ceph-mon[74913]: 8.14 scrub ok
Oct 10 09:47:41 compute-2 ceph-mon[74913]: 7.12 scrub starts
Oct 10 09:47:41 compute-2 ceph-mon[74913]: 7.12 scrub ok
Oct 10 09:47:41 compute-2 ceph-mon[74913]: osdmap e60: 3 total, 3 up, 3 in
Oct 10 09:47:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:41 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:42 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:43 compute-2 ceph-mon[74913]: pgmap v58: 353 pgs: 62 unknown, 291 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:47:43 compute-2 ceph-mon[74913]: 8.15 scrub starts
Oct 10 09:47:43 compute-2 ceph-mon[74913]: 8.15 scrub ok
Oct 10 09:47:43 compute-2 ceph-mon[74913]: 7.a scrub starts
Oct 10 09:47:43 compute-2 ceph-mon[74913]: 7.a scrub ok
Oct 10 09:47:43 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:43 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:43 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:43 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:43 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:43 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:44 compute-2 ceph-mon[74913]: Deploying daemon alertmanager.compute-0 on compute-0
Oct 10 09:47:44 compute-2 ceph-mon[74913]: 8.3 scrub starts
Oct 10 09:47:44 compute-2 ceph-mon[74913]: 8.3 scrub ok
Oct 10 09:47:44 compute-2 ceph-mon[74913]: 7.13 scrub starts
Oct 10 09:47:44 compute-2 ceph-mon[74913]: 7.13 scrub ok
Oct 10 09:47:44 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:44 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:45 compute-2 ceph-mon[74913]: pgmap v60: 353 pgs: 62 unknown, 291 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:47:45 compute-2 ceph-mon[74913]: 8.17 scrub starts
Oct 10 09:47:45 compute-2 ceph-mon[74913]: 8.17 scrub ok
Oct 10 09:47:45 compute-2 ceph-mon[74913]: 7.11 scrub starts
Oct 10 09:47:45 compute-2 ceph-mon[74913]: 7.11 scrub ok
Oct 10 09:47:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:45 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:45 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Oct 10 09:47:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:45 2025: (VI_0) Entering BACKUP STATE
Oct 10 09:47:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:45 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:46 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.16( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.7( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.17( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.6( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.15( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.16( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.17( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.13( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.11( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.2( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.3( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.3( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.9( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.9( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.b( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.8( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.a( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.d( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.c( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.8( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.b( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.3( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.5( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:46 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.5( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.f( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.18( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.1f( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.19( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.1d( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.1c( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.13( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.17( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.11( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.13( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.15( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-mon[74913]: 8.8 scrub starts
Oct 10 09:47:46 compute-2 ceph-mon[74913]: 8.8 scrub ok
Oct 10 09:47:46 compute-2 ceph-mon[74913]: 7.16 deep-scrub starts
Oct 10 09:47:46 compute-2 ceph-mon[74913]: 7.16 deep-scrub ok
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.13( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:46 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.1( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.7( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.4( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.9( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.5( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.9( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.3( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.2( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.5( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.3( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.14( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.19( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.1a( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.11( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.1d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.18( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.1d( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.a( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.7( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.1e( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.1d( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.1b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.17( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.11( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.16( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.1f( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:47 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.17( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.17( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.13( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.15( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.13( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.15( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.19( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.9( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.9( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.19( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.7( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.7( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.11( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.11( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.3( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.5( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.3( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.5( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.13( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.9( v 60'1 lc 0'0 (0'0,60'1] local-lis/les=61/62 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=60'1 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.7( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.4( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.c( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.11( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.14( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.16( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.1d( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.18( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.5( v 44'6 (0'0,44'6] local-lis/les=61/62 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.19( v 48'48 (0'0,48'48] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.a( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.9( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.5( v 51'44 (0'0,51'44] local-lis/les=61/62 n=1 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.2( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.5( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.7( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.6( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.b( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.e( v 60'57 lc 60'56 (0'0,60'57] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=60'57 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.3( v 60'57 lc 60'56 (0'0,60'57] local-lis/les=61/62 n=1 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=60'57 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.1e( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.8( v 48'48 (0'0,48'48] local-lis/les=61/62 n=1 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.d( v 51'44 lc 51'19 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.15( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.a( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.16( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.b( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.1f( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.3( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.13( v 48'48 (0'0,48'48] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.11( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.16( v 48'48 (0'0,48'48] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.a( v 48'48 (0'0,48'48] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.2( v 51'44 (0'0,51'44] local-lis/les=61/62 n=1 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.3( v 44'6 (0'0,44'6] local-lis/les=61/62 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.9( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.8( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.17( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.16( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.1f( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.1c( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.1a( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.1d( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.18( v 60'1 lc 0'0 (0'0,60'1] local-lis/les=61/62 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=60'1 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.17( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.11( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.f( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.1d( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.17( v 48'48 (0'0,48'48] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.3( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.13( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:47 compute-2 ceph-mon[74913]: 8.f deep-scrub starts
Oct 10 09:47:47 compute-2 ceph-mon[74913]: pgmap v61: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 2 op/s
Oct 10 09:47:47 compute-2 ceph-mon[74913]: 8.f deep-scrub ok
Oct 10 09:47:47 compute-2 ceph-mon[74913]: 7.15 scrub starts
Oct 10 09:47:47 compute-2 ceph-mon[74913]: 7.15 scrub ok
Oct 10 09:47:47 compute-2 ceph-mon[74913]: Regenerating cephadm self-signed grafana TLS certificates
Oct 10 09:47:47 compute-2 ceph-mon[74913]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Oct 10 09:47:47 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:47 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 09:47:47 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 09:47:47 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 09:47:47 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 09:47:47 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Oct 10 09:47:47 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 09:47:47 compute-2 ceph-mon[74913]: osdmap e61: 3 total, 3 up, 3 in
Oct 10 09:47:47 compute-2 ceph-mon[74913]: Deploying daemon grafana.compute-0 on compute-0
Oct 10 09:47:47 compute-2 ceph-mon[74913]: 7.1a scrub starts
Oct 10 09:47:47 compute-2 ceph-mon[74913]: 7.1a scrub ok
Oct 10 09:47:47 compute-2 ceph-mon[74913]: osdmap e62: 3 total, 3 up, 3 in
Oct 10 09:47:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:47 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:47 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:47 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.13 deep-scrub starts
Oct 10 09:47:47 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.13 deep-scrub ok
Oct 10 09:47:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:48 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Oct 10 09:47:48 compute-2 ceph-mon[74913]: 9.14 scrub starts
Oct 10 09:47:48 compute-2 ceph-mon[74913]: 9.14 scrub ok
Oct 10 09:47:48 compute-2 ceph-mon[74913]: pgmap v64: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 2 op/s
Oct 10 09:47:48 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Oct 10 09:47:48 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Oct 10 09:47:48 compute-2 ceph-mon[74913]: osdmap e63: 3 total, 3 up, 3 in
Oct 10 09:47:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Oct 10 09:47:48 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 64 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:48 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 64 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:48 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 64 pg[10.1( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:48 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 64 pg[10.1( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:49 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.11( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.11( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.5( v 64'1098 (0'0,64'1098] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=59'1094 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.5( v 64'1098 (0'0,64'1098] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=59'1094 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.3( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.3( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:49 compute-2 ceph-mon[74913]: 11.15 scrub starts
Oct 10 09:47:49 compute-2 ceph-mon[74913]: 12.13 deep-scrub starts
Oct 10 09:47:49 compute-2 ceph-mon[74913]: 11.15 scrub ok
Oct 10 09:47:49 compute-2 ceph-mon[74913]: 12.13 deep-scrub ok
Oct 10 09:47:49 compute-2 ceph-mon[74913]: osdmap e64: 3 total, 3 up, 3 in
Oct 10 09:47:49 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:49 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:49 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:49 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.17 scrub starts
Oct 10 09:47:49 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.17 scrub ok
Oct 10 09:47:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:50 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-mon[74913]: 9.2 scrub starts
Oct 10 09:47:50 compute-2 ceph-mon[74913]: 9.2 scrub ok
Oct 10 09:47:50 compute-2 ceph-mon[74913]: pgmap v67: 353 pgs: 2 peering, 351 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 102 B/s, 4 objects/s recovering
Oct 10 09:47:50 compute-2 ceph-mon[74913]: 11.0 scrub starts
Oct 10 09:47:50 compute-2 ceph-mon[74913]: 11.0 scrub ok
Oct 10 09:47:50 compute-2 ceph-mon[74913]: osdmap e65: 3 total, 3 up, 3 in
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.11( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.3( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.5( v 64'1098 (0'0,64'1098] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=64'1098 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:47:50 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1 deep-scrub starts
Oct 10 09:47:50 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1 deep-scrub ok
Oct 10 09:47:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:51 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:51 compute-2 ceph-mon[74913]: 10.17 scrub starts
Oct 10 09:47:51 compute-2 ceph-mon[74913]: 10.17 scrub ok
Oct 10 09:47:51 compute-2 ceph-mon[74913]: 11.c scrub starts
Oct 10 09:47:51 compute-2 ceph-mon[74913]: 11.c scrub ok
Oct 10 09:47:51 compute-2 ceph-mon[74913]: osdmap e66: 3 total, 3 up, 3 in
Oct 10 09:47:51 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Oct 10 09:47:51 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1f scrub starts
Oct 10 09:47:51 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1f scrub ok
Oct 10 09:47:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:51 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:52 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:52 compute-2 ceph-mon[74913]: 10.1 deep-scrub starts
Oct 10 09:47:52 compute-2 ceph-mon[74913]: 10.1 deep-scrub ok
Oct 10 09:47:52 compute-2 ceph-mon[74913]: pgmap v70: 353 pgs: 8 remapped+peering, 14 active+remapped, 2 peering, 329 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 806 B/s, 25 objects/s recovering
Oct 10 09:47:52 compute-2 ceph-mon[74913]: 11.b deep-scrub starts
Oct 10 09:47:52 compute-2 ceph-mon[74913]: 11.b deep-scrub ok
Oct 10 09:47:52 compute-2 ceph-mon[74913]: osdmap e67: 3 total, 3 up, 3 in
Oct 10 09:47:52 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.7 scrub starts
Oct 10 09:47:52 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.7 scrub ok
Oct 10 09:47:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:53 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:53 compute-2 ceph-mon[74913]: 10.1f scrub starts
Oct 10 09:47:53 compute-2 ceph-mon[74913]: 10.1f scrub ok
Oct 10 09:47:53 compute-2 ceph-mon[74913]: 11.9 deep-scrub starts
Oct 10 09:47:53 compute-2 ceph-mon[74913]: 11.9 deep-scrub ok
Oct 10 09:47:53 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1b scrub starts
Oct 10 09:47:53 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1b scrub ok
Oct 10 09:47:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:53 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8968000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:54 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938000d00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:54 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.7 deep-scrub starts
Oct 10 09:47:54 compute-2 ceph-mon[74913]: 10.7 scrub starts
Oct 10 09:47:54 compute-2 ceph-mon[74913]: 10.7 scrub ok
Oct 10 09:47:54 compute-2 ceph-mon[74913]: pgmap v72: 353 pgs: 8 remapped+peering, 14 active+remapped, 2 peering, 329 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 632 B/s, 20 objects/s recovering
Oct 10 09:47:54 compute-2 ceph-mon[74913]: 11.d deep-scrub starts
Oct 10 09:47:54 compute-2 ceph-mon[74913]: 11.d deep-scrub ok
Oct 10 09:47:54 compute-2 ceph-mon[74913]: 7.19 scrub starts
Oct 10 09:47:54 compute-2 ceph-mon[74913]: 7.19 scrub ok
Oct 10 09:47:54 compute-2 ceph-mon[74913]: 10.1b scrub starts
Oct 10 09:47:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:54 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:54 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.7 deep-scrub ok
Oct 10 09:47:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.931797) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089674931865, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1078, "num_deletes": 251, "total_data_size": 1850155, "memory_usage": 1888160, "flush_reason": "Manual Compaction"}
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089674945966, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1187808, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6529, "largest_seqno": 7602, "table_properties": {"data_size": 1182537, "index_size": 2603, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13502, "raw_average_key_size": 21, "raw_value_size": 1171020, "raw_average_value_size": 1847, "num_data_blocks": 115, "num_entries": 634, "num_filter_entries": 634, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089648, "oldest_key_time": 1760089648, "file_creation_time": 1760089674, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 14225 microseconds, and 8607 cpu microseconds.
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.946017) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1187808 bytes OK
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.946038) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.947184) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.947201) EVENT_LOG_v1 {"time_micros": 1760089674947195, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.947221) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 1844398, prev total WAL file size 1844398, number of live WAL files 2.
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.947761) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1159KB)], [15(10MB)]
Oct 10 09:47:54 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089674947792, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12570893, "oldest_snapshot_seqno": -1}
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3077 keys, 11334814 bytes, temperature: kUnknown
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089675013149, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 11334814, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11310343, "index_size": 15658, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7749, "raw_key_size": 79218, "raw_average_key_size": 25, "raw_value_size": 11249777, "raw_average_value_size": 3656, "num_data_blocks": 685, "num_entries": 3077, "num_filter_entries": 3077, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760089674, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.013397) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 11334814 bytes
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.014901) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 192.1 rd, 173.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 10.9 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(20.1) write-amplify(9.5) OK, records in: 3604, records dropped: 527 output_compression: NoCompression
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.014924) EVENT_LOG_v1 {"time_micros": 1760089675014914, "job": 6, "event": "compaction_finished", "compaction_time_micros": 65433, "compaction_time_cpu_micros": 21720, "output_level": 6, "num_output_files": 1, "total_output_size": 11334814, "num_input_records": 3604, "num_output_records": 3077, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089675015386, "job": 6, "event": "table_file_deletion", "file_number": 17}
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089675017929, "job": 6, "event": "table_file_deletion", "file_number": 15}
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.947695) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.017977) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.017981) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.017982) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.017984) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:47:55 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.017985) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:47:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:55 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:55 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.4 scrub starts
Oct 10 09:47:55 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.4 scrub ok
Oct 10 09:47:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:55 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:56 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8968000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:56 compute-2 ceph-mon[74913]: 10.1b scrub ok
Oct 10 09:47:56 compute-2 ceph-mon[74913]: Deploying daemon haproxy.rgw.default.compute-0.ofnenu on compute-0
Oct 10 09:47:56 compute-2 ceph-mon[74913]: 8.e scrub starts
Oct 10 09:47:56 compute-2 ceph-mon[74913]: 8.e scrub ok
Oct 10 09:47:56 compute-2 ceph-mon[74913]: 12.15 scrub starts
Oct 10 09:47:56 compute-2 ceph-mon[74913]: 12.15 scrub ok
Oct 10 09:47:56 compute-2 ceph-mon[74913]: 12.7 deep-scrub starts
Oct 10 09:47:56 compute-2 ceph-mon[74913]: 12.7 deep-scrub ok
Oct 10 09:47:56 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Oct 10 09:47:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Oct 10 09:47:56 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.c scrub starts
Oct 10 09:47:56 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.c scrub ok
Oct 10 09:47:57 compute-2 ceph-mon[74913]: pgmap v73: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 469 B/s, 17 objects/s recovering
Oct 10 09:47:57 compute-2 ceph-mon[74913]: 9.c scrub starts
Oct 10 09:47:57 compute-2 ceph-mon[74913]: 9.c scrub ok
Oct 10 09:47:57 compute-2 ceph-mon[74913]: 7.c scrub starts
Oct 10 09:47:57 compute-2 ceph-mon[74913]: 7.c scrub ok
Oct 10 09:47:57 compute-2 ceph-mon[74913]: 12.4 scrub starts
Oct 10 09:47:57 compute-2 ceph-mon[74913]: 12.4 scrub ok
Oct 10 09:47:57 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Oct 10 09:47:57 compute-2 ceph-mon[74913]: osdmap e68: 3 total, 3 up, 3 in
Oct 10 09:47:57 compute-2 sudo[85866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:47:57 compute-2 sudo[85866]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:47:57 compute-2 sudo[85866]: pam_unix(sudo:session): session closed for user root
Oct 10 09:47:57 compute-2 sudo[85891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:47:57 compute-2 sudo[85891]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:47:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:57 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:57 compute-2 podman[85957]: 2025-10-10 09:47:57.824801574 +0000 UTC m=+0.042896108 container create ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 09:47:57 compute-2 systemd[1]: Started libpod-conmon-ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755.scope.
Oct 10 09:47:57 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:47:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:57 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:57 compute-2 podman[85957]: 2025-10-10 09:47:57.807659868 +0000 UTC m=+0.025754402 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Oct 10 09:47:57 compute-2 podman[85957]: 2025-10-10 09:47:57.906228078 +0000 UTC m=+0.124322612 container init ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 09:47:57 compute-2 podman[85957]: 2025-10-10 09:47:57.912541849 +0000 UTC m=+0.130636363 container start ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 09:47:57 compute-2 podman[85957]: 2025-10-10 09:47:57.915412991 +0000 UTC m=+0.133507525 container attach ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 09:47:57 compute-2 gracious_wiles[85974]: 0 0
Oct 10 09:47:57 compute-2 systemd[1]: libpod-ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755.scope: Deactivated successfully.
Oct 10 09:47:57 compute-2 podman[85957]: 2025-10-10 09:47:57.91820473 +0000 UTC m=+0.136299254 container died ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 09:47:57 compute-2 systemd[1]: var-lib-containers-storage-overlay-5552605a6ad55e4d529a53d7f2fcdf2f58cfb82d594f5d0047a2b5e5948df48e-merged.mount: Deactivated successfully.
Oct 10 09:47:57 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.11 scrub starts
Oct 10 09:47:57 compute-2 podman[85957]: 2025-10-10 09:47:57.95087311 +0000 UTC m=+0.168967624 container remove ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 09:47:57 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.11 scrub ok
Oct 10 09:47:57 compute-2 systemd[1]: libpod-conmon-ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755.scope: Deactivated successfully.
Oct 10 09:47:58 compute-2 systemd[1]: Reloading.
Oct 10 09:47:58 compute-2 systemd-rc-local-generator[86016]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:47:58 compute-2 systemd-sysv-generator[86020]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:47:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:58 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:58 compute-2 ceph-mon[74913]: 11.2 scrub starts
Oct 10 09:47:58 compute-2 ceph-mon[74913]: 11.2 scrub ok
Oct 10 09:47:58 compute-2 ceph-mon[74913]: 7.d scrub starts
Oct 10 09:47:58 compute-2 ceph-mon[74913]: 7.d scrub ok
Oct 10 09:47:58 compute-2 ceph-mon[74913]: 8.c scrub starts
Oct 10 09:47:58 compute-2 ceph-mon[74913]: 8.c scrub ok
Oct 10 09:47:58 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:58 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:58 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:58 compute-2 ceph-mon[74913]: Deploying daemon haproxy.rgw.default.compute-2.mhdkdo on compute-2
Oct 10 09:47:58 compute-2 ceph-mon[74913]: pgmap v75: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 2 objects/s recovering
Oct 10 09:47:58 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Oct 10 09:47:58 compute-2 ceph-mon[74913]: 7.1 scrub starts
Oct 10 09:47:58 compute-2 ceph-mon[74913]: 7.1 scrub ok
Oct 10 09:47:58 compute-2 systemd[1]: Reloading.
Oct 10 09:47:58 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 69 pg[10.14( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=69) [2] r=0 lpr=69 pi=[56,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 69 pg[10.c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=69) [2] r=0 lpr=69 pi=[56,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 69 pg[10.4( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=69) [2] r=0 lpr=69 pi=[56,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 69 pg[10.1c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=69) [2] r=0 lpr=69 pi=[56,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:47:58 compute-2 systemd-rc-local-generator[86061]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:47:58 compute-2 systemd-sysv-generator[86064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:47:58 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.4( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.4( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.14( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.14( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.1c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:47:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.1c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:47:58 compute-2 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.mhdkdo for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:47:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:47:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 09:47:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:47:58.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 09:47:58 compute-2 podman[86118]: 2025-10-10 09:47:58.813511421 +0000 UTC m=+0.042283238 container create 368ea6225e5db4fb8f3c793229a798884b40b020716bc0a76ab32ec2cbc8121d (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-rgw-default-compute-2-mhdkdo)
Oct 10 09:47:58 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7b9a57660298361ca69f9d96249345bb33b06e12a121d055002216180d6c03a/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Oct 10 09:47:58 compute-2 podman[86118]: 2025-10-10 09:47:58.877776148 +0000 UTC m=+0.106547975 container init 368ea6225e5db4fb8f3c793229a798884b40b020716bc0a76ab32ec2cbc8121d (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-rgw-default-compute-2-mhdkdo)
Oct 10 09:47:58 compute-2 podman[86118]: 2025-10-10 09:47:58.887060645 +0000 UTC m=+0.115832422 container start 368ea6225e5db4fb8f3c793229a798884b40b020716bc0a76ab32ec2cbc8121d (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-rgw-default-compute-2-mhdkdo)
Oct 10 09:47:58 compute-2 bash[86118]: 368ea6225e5db4fb8f3c793229a798884b40b020716bc0a76ab32ec2cbc8121d
Oct 10 09:47:58 compute-2 podman[86118]: 2025-10-10 09:47:58.795653822 +0000 UTC m=+0.024425619 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Oct 10 09:47:58 compute-2 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.mhdkdo for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:47:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-rgw-default-compute-2-mhdkdo[86133]: [NOTICE] 282/094758 (2) : New worker #1 (4) forked
Oct 10 09:47:58 compute-2 sudo[85891]: pam_unix(sudo:session): session closed for user root
Oct 10 09:47:58 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Oct 10 09:47:58 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Oct 10 09:47:59 compute-2 sudo[86147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:47:59 compute-2 sudo[86147]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:47:59 compute-2 sudo[86147]: pam_unix(sudo:session): session closed for user root
Oct 10 09:47:59 compute-2 sudo[86172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:47:59 compute-2 sudo[86172]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:47:59 compute-2 ceph-mon[74913]: 8.1 deep-scrub starts
Oct 10 09:47:59 compute-2 ceph-mon[74913]: 8.1 deep-scrub ok
Oct 10 09:47:59 compute-2 ceph-mon[74913]: 12.11 scrub starts
Oct 10 09:47:59 compute-2 ceph-mon[74913]: 12.11 scrub ok
Oct 10 09:47:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Oct 10 09:47:59 compute-2 ceph-mon[74913]: osdmap e69: 3 total, 3 up, 3 in
Oct 10 09:47:59 compute-2 ceph-mon[74913]: osdmap e70: 3 total, 3 up, 3 in
Oct 10 09:47:59 compute-2 ceph-mon[74913]: 7.7 scrub starts
Oct 10 09:47:59 compute-2 ceph-mon[74913]: 7.7 scrub ok
Oct 10 09:47:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:59 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:47:59 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct 10 09:47:59 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct 10 09:47:59 compute-2 ceph-mon[74913]: Deploying daemon keepalived.rgw.default.compute-2.bbeizy on compute-2
Oct 10 09:47:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Oct 10 09:47:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:59 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8968000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:59 compute-2 podman[86238]: 2025-10-10 09:47:59.573145241 +0000 UTC m=+0.052414061 container create b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, release=1793, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, vcs-type=git)
Oct 10 09:47:59 compute-2 systemd[1]: Started libpod-conmon-b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6.scope.
Oct 10 09:47:59 compute-2 podman[86238]: 2025-10-10 09:47:59.550110607 +0000 UTC m=+0.029379447 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Oct 10 09:47:59 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:47:59 compute-2 podman[86238]: 2025-10-10 09:47:59.671769322 +0000 UTC m=+0.151038152 container init b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, architecture=x86_64, com.redhat.component=keepalived-container, distribution-scope=public, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, vendor=Red Hat, Inc.)
Oct 10 09:47:59 compute-2 podman[86238]: 2025-10-10 09:47:59.680865062 +0000 UTC m=+0.160133872 container start b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., release=1793, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, io.buildah.version=1.28.2, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20)
Oct 10 09:47:59 compute-2 podman[86238]: 2025-10-10 09:47:59.683654181 +0000 UTC m=+0.162922981 container attach b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, build-date=2023-02-22T09:23:20, vcs-type=git, distribution-scope=public, architecture=x86_64, release=1793, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, vendor=Red Hat, Inc.)
Oct 10 09:47:59 compute-2 thirsty_chatterjee[86255]: 0 0
Oct 10 09:47:59 compute-2 systemd[1]: libpod-b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6.scope: Deactivated successfully.
Oct 10 09:47:59 compute-2 conmon[86255]: conmon b565333fd7652d84803b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6.scope/container/memory.events
Oct 10 09:47:59 compute-2 podman[86238]: 2025-10-10 09:47:59.690639654 +0000 UTC m=+0.169908464 container died b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, vendor=Red Hat, Inc., architecture=x86_64, name=keepalived, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, vcs-type=git)
Oct 10 09:47:59 compute-2 systemd[1]: var-lib-containers-storage-overlay-e1938b87bc92c8c02f70d75105dec5219aa891e8473639f341cb105abb51494f-merged.mount: Deactivated successfully.
Oct 10 09:47:59 compute-2 podman[86238]: 2025-10-10 09:47:59.734452939 +0000 UTC m=+0.213721749 container remove b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, release=1793, vcs-type=git, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph)
Oct 10 09:47:59 compute-2 systemd[1]: libpod-conmon-b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6.scope: Deactivated successfully.
Oct 10 09:47:59 compute-2 systemd[1]: Reloading.
Oct 10 09:47:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e71 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:47:59 compute-2 systemd-rc-local-generator[86306]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:47:59 compute-2 systemd-sysv-generator[86310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:47:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:59 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:47:59 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1d scrub starts
Oct 10 09:47:59 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1d scrub ok
Oct 10 09:48:00 compute-2 systemd[1]: Reloading.
Oct 10 09:48:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:00 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:00 compute-2 systemd-rc-local-generator[86340]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:48:00 compute-2 systemd-sysv-generator[86347]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:48:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:48:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:00.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:48:00 compute-2 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.bbeizy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:48:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Oct 10 09:48:00 compute-2 ceph-mon[74913]: 9.0 scrub starts
Oct 10 09:48:00 compute-2 ceph-mon[74913]: 9.0 scrub ok
Oct 10 09:48:00 compute-2 ceph-mon[74913]: 7.14 scrub starts
Oct 10 09:48:00 compute-2 ceph-mon[74913]: 7.14 scrub ok
Oct 10 09:48:00 compute-2 ceph-mon[74913]: osdmap e71: 3 total, 3 up, 3 in
Oct 10 09:48:00 compute-2 ceph-mon[74913]: pgmap v79: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:00 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.4( v 71'1102 (0'0,71'1102] local-lis/les=0/0 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 luod=0'0 crt=60'1098 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.4( v 71'1102 (0'0,71'1102] local-lis/les=0/0 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=60'1098 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.230051041s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 active pruub 144.965850830s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.230030060s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 144.965850830s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.229329109s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 active pruub 144.965774536s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.229254723s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 144.965774536s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.228994370s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 active pruub 144.965759277s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.228788376s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 144.965759277s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.5( v 67'1101 (0'0,67'1101] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.228815079s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=66'1099 lcod 66'1100 mlcod 66'1100 active pruub 144.965835571s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.5( v 67'1101 (0'0,67'1101] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.228641510s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=66'1099 lcod 66'1100 mlcod 0'0 unknown NOTIFY pruub 144.965835571s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:00.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:00 compute-2 podman[86404]: 2025-10-10 09:48:00.607283545 +0000 UTC m=+0.046825002 container create f8cfe2dfc37a24698160fece75542d6b585efaf1af2bb0ec7b8d11be8a7b8654 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, build-date=2023-02-22T09:23:20, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, name=keepalived, release=1793, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, version=2.2.4, io.openshift.expose-services=)
Oct 10 09:48:00 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00956075af261ac334734a594a59fe9de93d7e170a73eb744c05481b26897625/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:48:00 compute-2 podman[86404]: 2025-10-10 09:48:00.590415228 +0000 UTC m=+0.029956705 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Oct 10 09:48:00 compute-2 podman[86404]: 2025-10-10 09:48:00.687289834 +0000 UTC m=+0.126831341 container init f8cfe2dfc37a24698160fece75542d6b585efaf1af2bb0ec7b8d11be8a7b8654 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, name=keepalived)
Oct 10 09:48:00 compute-2 podman[86404]: 2025-10-10 09:48:00.697179459 +0000 UTC m=+0.136720936 container start f8cfe2dfc37a24698160fece75542d6b585efaf1af2bb0ec7b8d11be8a7b8654 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, release=1793)
Oct 10 09:48:00 compute-2 bash[86404]: f8cfe2dfc37a24698160fece75542d6b585efaf1af2bb0ec7b8d11be8a7b8654
Oct 10 09:48:00 compute-2 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.bbeizy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:48:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Starting Keepalived v2.2.4 (08/21,2021)
Oct 10 09:48:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Running on Linux 5.14.0-621.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Tue Sep 30 07:37:35 UTC 2025 (built for Linux 5.14.0)
Oct 10 09:48:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Oct 10 09:48:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Configuration file /etc/keepalived/keepalived.conf
Oct 10 09:48:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Failed to bind to process monitoring socket - errno 98 - Address already in use
Oct 10 09:48:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Oct 10 09:48:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Starting VRRP child process, pid=4
Oct 10 09:48:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Startup complete
Oct 10 09:48:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: (VI_0) Entering BACKUP STATE (init)
Oct 10 09:48:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: VRRP_Script(check_backend) succeeded
Oct 10 09:48:00 compute-2 sudo[86172]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:01 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Oct 10 09:48:01 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Oct 10 09:48:01 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.5( v 67'1101 (0'0,67'1101] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=66'1099 lcod 66'1100 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.5( v 67'1101 (0'0,67'1101] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=66'1099 lcod 66'1100 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:01 compute-2 ceph-mon[74913]: 8.0 scrub starts
Oct 10 09:48:01 compute-2 ceph-mon[74913]: 8.0 scrub ok
Oct 10 09:48:01 compute-2 ceph-mon[74913]: 12.1d scrub starts
Oct 10 09:48:01 compute-2 ceph-mon[74913]: 12.1d scrub ok
Oct 10 09:48:01 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Oct 10 09:48:01 compute-2 ceph-mon[74913]: osdmap e72: 3 total, 3 up, 3 in
Oct 10 09:48:01 compute-2 ceph-mon[74913]: 10.0 scrub starts
Oct 10 09:48:01 compute-2 ceph-mon[74913]: 10.0 scrub ok
Oct 10 09:48:01 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:01 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:01 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:01 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct 10 09:48:01 compute-2 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct 10 09:48:01 compute-2 ceph-mon[74913]: Deploying daemon keepalived.rgw.default.compute-0.igkrok on compute-0
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.4( v 71'1102 (0'0,71'1102] local-lis/les=72/73 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=71'1102 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:01 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:01 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8968000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:02 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Oct 10 09:48:02 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Oct 10 09:48:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:02 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:02.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:02 compute-2 ceph-mon[74913]: 9.1 scrub starts
Oct 10 09:48:02 compute-2 ceph-mon[74913]: 9.1 scrub ok
Oct 10 09:48:02 compute-2 ceph-mon[74913]: 9.5 scrub starts
Oct 10 09:48:02 compute-2 ceph-mon[74913]: 9.5 scrub ok
Oct 10 09:48:02 compute-2 ceph-mon[74913]: osdmap e73: 3 total, 3 up, 3 in
Oct 10 09:48:02 compute-2 ceph-mon[74913]: pgmap v82: 353 pgs: 1 active+remapped, 1 active+recovering+remapped, 2 active+recovery_wait+remapped, 349 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 19/219 objects misplaced (8.676%); 0 B/s, 2 objects/s recovering
Oct 10 09:48:02 compute-2 ceph-mon[74913]: 8.7 scrub starts
Oct 10 09:48:02 compute-2 ceph-mon[74913]: 8.7 scrub ok
Oct 10 09:48:02 compute-2 ceph-mon[74913]: 10.8 scrub starts
Oct 10 09:48:02 compute-2 ceph-mon[74913]: 10.8 scrub ok
Oct 10 09:48:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Oct 10 09:48:02 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 74 pg[10.5( v 67'1101 (0'0,67'1101] local-lis/les=73/74 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] async=[0] r=0 lpr=73 pi=[65,73)/1 crt=67'1101 lcod 66'1100 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:02 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 74 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] async=[0] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:02 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 74 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] async=[0] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:02 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 74 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] async=[0] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:02.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:03 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Oct 10 09:48:03 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Oct 10 09:48:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:03 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Oct 10 09:48:03 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.105600357s) [0] async=[0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 51'1091 active pruub 148.798385620s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:03 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.105497360s) [0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 148.798385620s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:03 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.104902267s) [0] async=[0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 51'1091 active pruub 148.798400879s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:03 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.104844093s) [0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 148.798400879s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:03 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=6 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.104885101s) [0] async=[0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 51'1091 active pruub 148.798400879s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:03 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.5( v 74'1104 (0'0,74'1104] local-lis/les=73/74 n=6 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.100663185s) [0] async=[0] r=-1 lpr=75 pi=[65,75)/1 crt=67'1101 lcod 74'1103 mlcod 74'1103 active pruub 148.794448853s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:03 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.5( v 74'1104 (0'0,74'1104] local-lis/les=73/74 n=6 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.100558281s) [0] r=-1 lpr=75 pi=[65,75)/1 crt=67'1101 lcod 74'1103 mlcod 0'0 unknown NOTIFY pruub 148.794448853s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:03 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=6 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.104282379s) [0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 148.798400879s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:03 compute-2 ceph-mon[74913]: 9.18 scrub starts
Oct 10 09:48:03 compute-2 ceph-mon[74913]: 9.18 scrub ok
Oct 10 09:48:03 compute-2 ceph-mon[74913]: 10.10 scrub starts
Oct 10 09:48:03 compute-2 ceph-mon[74913]: 10.10 scrub ok
Oct 10 09:48:03 compute-2 ceph-mon[74913]: osdmap e74: 3 total, 3 up, 3 in
Oct 10 09:48:03 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:03 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:03 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:03 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:03 compute-2 ceph-mon[74913]: osdmap e75: 3 total, 3 up, 3 in
Oct 10 09:48:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:03 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:03 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:04 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Oct 10 09:48:04 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Oct 10 09:48:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:04 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89680091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:04.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:04 2025: (VI_0) Entering MASTER STATE
Oct 10 09:48:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Oct 10 09:48:04 compute-2 ceph-mon[74913]: 11.19 scrub starts
Oct 10 09:48:04 compute-2 ceph-mon[74913]: 11.19 scrub ok
Oct 10 09:48:04 compute-2 ceph-mon[74913]: Deploying daemon prometheus.compute-0 on compute-0
Oct 10 09:48:04 compute-2 ceph-mon[74913]: 10.18 scrub starts
Oct 10 09:48:04 compute-2 ceph-mon[74913]: 10.18 scrub ok
Oct 10 09:48:04 compute-2 ceph-mon[74913]: pgmap v85: 353 pgs: 1 active+remapped, 1 active+recovering+remapped, 2 active+recovery_wait+remapped, 349 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 19/219 objects misplaced (8.676%); 0 B/s, 1 objects/s recovering
Oct 10 09:48:04 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:04 compute-2 ceph-mon[74913]: osdmap e76: 3 total, 3 up, 3 in
Oct 10 09:48:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:04.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e76 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:04 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Oct 10 09:48:05 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Oct 10 09:48:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:05 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89680091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Oct 10 09:48:05 compute-2 ceph-mon[74913]: 8.5 scrub starts
Oct 10 09:48:05 compute-2 ceph-mon[74913]: 8.5 scrub ok
Oct 10 09:48:05 compute-2 ceph-mon[74913]: 10.15 scrub starts
Oct 10 09:48:05 compute-2 ceph-mon[74913]: 10.15 scrub ok
Oct 10 09:48:05 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Oct 10 09:48:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:05 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:05 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Oct 10 09:48:06 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Oct 10 09:48:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:06 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:48:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:06.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:48:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:06 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Oct 10 09:48:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:06 2025: (VI_0) Entering BACKUP STATE
Oct 10 09:48:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 09:48:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:06.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 09:48:06 compute-2 ceph-mon[74913]: 9.9 scrub starts
Oct 10 09:48:06 compute-2 ceph-mon[74913]: 9.9 scrub ok
Oct 10 09:48:06 compute-2 ceph-mon[74913]: 12.f deep-scrub starts
Oct 10 09:48:06 compute-2 ceph-mon[74913]: 12.f deep-scrub ok
Oct 10 09:48:06 compute-2 ceph-mon[74913]: pgmap v87: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 262 B/s, 10 objects/s recovering
Oct 10 09:48:06 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Oct 10 09:48:06 compute-2 ceph-mon[74913]: osdmap e77: 3 total, 3 up, 3 in
Oct 10 09:48:06 compute-2 ceph-mon[74913]: 10.d deep-scrub starts
Oct 10 09:48:06 compute-2 ceph-mon[74913]: 10.d deep-scrub ok
Oct 10 09:48:06 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Oct 10 09:48:06 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.2 deep-scrub starts
Oct 10 09:48:06 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.2 deep-scrub ok
Oct 10 09:48:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:07 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89680091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:07 compute-2 ceph-mon[74913]: 7.5 scrub starts
Oct 10 09:48:07 compute-2 ceph-mon[74913]: 7.5 scrub ok
Oct 10 09:48:07 compute-2 ceph-mon[74913]: 7.0 scrub starts
Oct 10 09:48:07 compute-2 ceph-mon[74913]: 7.0 scrub ok
Oct 10 09:48:07 compute-2 ceph-mon[74913]: 10.12 scrub starts
Oct 10 09:48:07 compute-2 ceph-mon[74913]: osdmap e78: 3 total, 3 up, 3 in
Oct 10 09:48:07 compute-2 ceph-mon[74913]: 10.12 scrub ok
Oct 10 09:48:07 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Oct 10 09:48:07 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Oct 10 09:48:07 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.071848869s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 active pruub 152.962234497s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:07 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.071782112s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 152.962234497s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:07 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.074589729s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 active pruub 152.965835571s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:07 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.074548721s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 152.965835571s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:07 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=3 ec=56/45 lis/c=64/64 les/c/f=65/65/0 sis=79 pruub=14.043064117s) [1] r=-1 lpr=79 pi=[64,79)/1 crt=51'1091 mlcod 0'0 active pruub 151.934539795s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:07 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=3 ec=56/45 lis/c=64/64 les/c/f=65/65/0 sis=79 pruub=14.042997360s) [1] r=-1 lpr=79 pi=[64,79)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 151.934539795s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:07 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.074128151s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 active pruub 152.965805054s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:07 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.073760033s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 152.965805054s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:07 compute-2 sshd-session[86433]: Accepted publickey for zuul from 192.168.122.30 port 49904 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:48:07 compute-2 systemd-logind[796]: New session 37 of user zuul.
Oct 10 09:48:07 compute-2 systemd[1]: Started Session 37 of User zuul.
Oct 10 09:48:07 compute-2 sshd-session[86433]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:48:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:07 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89680091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:07 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.7 deep-scrub starts
Oct 10 09:48:07 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.7 deep-scrub ok
Oct 10 09:48:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:08 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938002cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:08.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:08 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Oct 10 09:48:08 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:08 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:08 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:08 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:08 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:08 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:08 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=3 ec=56/45 lis/c=64/64 les/c/f=65/65/0 sis=80) [1]/[2] r=0 lpr=80 pi=[64,80)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:08 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=3 ec=56/45 lis/c=64/64 les/c/f=65/65/0 sis=80) [1]/[2] r=0 lpr=80 pi=[64,80)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:08.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:08 compute-2 ceph-mon[74913]: 12.2 deep-scrub starts
Oct 10 09:48:08 compute-2 ceph-mon[74913]: 12.2 deep-scrub ok
Oct 10 09:48:08 compute-2 ceph-mon[74913]: 12.d scrub starts
Oct 10 09:48:08 compute-2 ceph-mon[74913]: 12.d scrub ok
Oct 10 09:48:08 compute-2 ceph-mon[74913]: pgmap v90: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 258 B/s, 10 objects/s recovering
Oct 10 09:48:08 compute-2 ceph-mon[74913]: 9.4 scrub starts
Oct 10 09:48:08 compute-2 ceph-mon[74913]: 9.4 scrub ok
Oct 10 09:48:08 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Oct 10 09:48:08 compute-2 ceph-mon[74913]: osdmap e79: 3 total, 3 up, 3 in
Oct 10 09:48:08 compute-2 ceph-mon[74913]: osdmap e80: 3 total, 3 up, 3 in
Oct 10 09:48:08 compute-2 python3.9[86587]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:48:08 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.b scrub starts
Oct 10 09:48:08 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.b scrub ok
Oct 10 09:48:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Oct 10 09:48:09 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 81 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=3 ec=56/45 lis/c=64/64 les/c/f=65/65/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[64,80)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:09 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 81 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:09 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 81 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:09 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 81 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:09 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938002cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:09 compute-2 ceph-mon[74913]: 9.7 deep-scrub starts
Oct 10 09:48:09 compute-2 ceph-mon[74913]: 9.7 deep-scrub ok
Oct 10 09:48:09 compute-2 ceph-mon[74913]: 12.5 scrub starts
Oct 10 09:48:09 compute-2 ceph-mon[74913]: 12.5 scrub ok
Oct 10 09:48:09 compute-2 ceph-mon[74913]: 11.18 deep-scrub starts
Oct 10 09:48:09 compute-2 ceph-mon[74913]: 11.18 deep-scrub ok
Oct 10 09:48:09 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:09 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:09 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:09 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Oct 10 09:48:09 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:09 compute-2 ceph-mon[74913]: osdmap e81: 3 total, 3 up, 3 in
Oct 10 09:48:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setuser ceph since I am not root
Oct 10 09:48:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setgroup ceph since I am not root
Oct 10 09:48:09 compute-2 ceph-mgr[75218]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Oct 10 09:48:09 compute-2 ceph-mgr[75218]: pidfile_write: ignore empty --pid-file
Oct 10 09:48:09 compute-2 sshd-session[82055]: Connection closed by 192.168.122.100 port 55678
Oct 10 09:48:09 compute-2 sshd-session[82037]: pam_unix(sshd:session): session closed for user ceph-admin
Oct 10 09:48:09 compute-2 systemd[1]: session-35.scope: Deactivated successfully.
Oct 10 09:48:09 compute-2 systemd[1]: session-35.scope: Consumed 21.211s CPU time.
Oct 10 09:48:09 compute-2 systemd-logind[796]: Session 35 logged out. Waiting for processes to exit.
Oct 10 09:48:09 compute-2 systemd-logind[796]: Removed session 35.
Oct 10 09:48:09 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'alerts'
Oct 10 09:48:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:09 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:09.935+0000 7f6000674140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 09:48:09 compute-2 ceph-mgr[75218]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 09:48:09 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'balancer'
Oct 10 09:48:09 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1e scrub starts
Oct 10 09:48:09 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1e scrub ok
Oct 10 09:48:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:10.013+0000 7f6000674140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 09:48:10 compute-2 ceph-mgr[75218]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 09:48:10 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'cephadm'
Oct 10 09:48:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:10 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:10.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Oct 10 09:48:10 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.981945038s) [1] async=[1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 51'1091 active pruub 155.713287354s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:10 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.981849670s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 155.713287354s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:10 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.981137276s) [1] async=[1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 51'1091 active pruub 155.713256836s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:10 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.980942726s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 155.713256836s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:10 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=6 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.980880737s) [1] async=[1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 51'1091 active pruub 155.713348389s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:10 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=6 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.980735779s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 155.713348389s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:10 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=3 ec=56/45 lis/c=80/64 les/c/f=81/65/0 sis=82 pruub=14.980568886s) [1] async=[1] r=-1 lpr=82 pi=[64,82)/1 crt=51'1091 mlcod 51'1091 active pruub 155.713226318s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:10 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=3 ec=56/45 lis/c=80/64 les/c/f=81/65/0 sis=82 pruub=14.980377197s) [1] r=-1 lpr=82 pi=[64,82)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 155.713226318s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:10 compute-2 sudo[86832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xewzftdxtwazhrnredlgdcihrlfucyxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089690.1340065-59-167974508814185/AnsiballZ_command.py'
Oct 10 09:48:10 compute-2 sudo[86832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:48:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:10.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:10 compute-2 ceph-mon[74913]: 8.b scrub starts
Oct 10 09:48:10 compute-2 ceph-mon[74913]: 8.b scrub ok
Oct 10 09:48:10 compute-2 ceph-mon[74913]: 12.0 scrub starts
Oct 10 09:48:10 compute-2 ceph-mon[74913]: 12.0 scrub ok
Oct 10 09:48:10 compute-2 ceph-mon[74913]: 9.1a scrub starts
Oct 10 09:48:10 compute-2 ceph-mon[74913]: 9.1a scrub ok
Oct 10 09:48:10 compute-2 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Oct 10 09:48:10 compute-2 ceph-mon[74913]: mgrmap e28: compute-0.xkdepb(active, since 96s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:48:10 compute-2 ceph-mon[74913]: osdmap e82: 3 total, 3 up, 3 in
Oct 10 09:48:10 compute-2 python3.9[86834]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:48:10 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'crash'
Oct 10 09:48:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:10.821+0000 7f6000674140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 09:48:10 compute-2 ceph-mgr[75218]: mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 09:48:10 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'dashboard'
Oct 10 09:48:10 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Oct 10 09:48:10 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Oct 10 09:48:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'devicehealth'
Oct 10 09:48:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:11 compute-2 ceph-mgr[75218]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 09:48:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'diskprediction_local'
Oct 10 09:48:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:11.422+0000 7f6000674140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 09:48:11 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Oct 10 09:48:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Oct 10 09:48:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Oct 10 09:48:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]:   from numpy import show_config as show_numpy_config
Oct 10 09:48:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:11.578+0000 7f6000674140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 09:48:11 compute-2 ceph-mgr[75218]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 09:48:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'influx'
Oct 10 09:48:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:11.645+0000 7f6000674140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 09:48:11 compute-2 ceph-mgr[75218]: mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 09:48:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'insights'
Oct 10 09:48:11 compute-2 ceph-mon[74913]: 12.1e scrub starts
Oct 10 09:48:11 compute-2 ceph-mon[74913]: 12.1e scrub ok
Oct 10 09:48:11 compute-2 ceph-mon[74913]: 12.1f scrub starts
Oct 10 09:48:11 compute-2 ceph-mon[74913]: 12.1f scrub ok
Oct 10 09:48:11 compute-2 ceph-mon[74913]: 9.1b deep-scrub starts
Oct 10 09:48:11 compute-2 ceph-mon[74913]: 9.1b deep-scrub ok
Oct 10 09:48:11 compute-2 ceph-mon[74913]: osdmap e83: 3 total, 3 up, 3 in
Oct 10 09:48:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'iostat'
Oct 10 09:48:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:11.785+0000 7f6000674140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 09:48:11 compute-2 ceph-mgr[75218]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 09:48:11 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'k8sevents'
Oct 10 09:48:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:11 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.a scrub starts
Oct 10 09:48:11 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.a scrub ok
Oct 10 09:48:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:12 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938002cb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:12 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'localpool'
Oct 10 09:48:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:48:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:12.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:48:12 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'mds_autoscaler'
Oct 10 09:48:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:12 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'mirroring'
Oct 10 09:48:12 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'nfs'
Oct 10 09:48:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:12.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:12 compute-2 ceph-mon[74913]: 11.8 scrub starts
Oct 10 09:48:12 compute-2 ceph-mon[74913]: 11.8 scrub ok
Oct 10 09:48:12 compute-2 ceph-mon[74913]: 7.17 scrub starts
Oct 10 09:48:12 compute-2 ceph-mon[74913]: 8.1a scrub starts
Oct 10 09:48:12 compute-2 ceph-mon[74913]: 7.17 scrub ok
Oct 10 09:48:12 compute-2 ceph-mon[74913]: 8.1a scrub ok
Oct 10 09:48:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:12.800+0000 7f6000674140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 09:48:12 compute-2 ceph-mgr[75218]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 09:48:12 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'orchestrator'
Oct 10 09:48:13 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Oct 10 09:48:13 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Oct 10 09:48:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.017+0000 7f6000674140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'osd_perf_query'
Oct 10 09:48:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.092+0000 7f6000674140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'osd_support'
Oct 10 09:48:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.157+0000 7f6000674140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'pg_autoscaler'
Oct 10 09:48:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.232+0000 7f6000674140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'progress'
Oct 10 09:48:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.304+0000 7f6000674140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'prometheus'
Oct 10 09:48:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.642+0000 7f6000674140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rbd_support'
Oct 10 09:48:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.734+0000 7f6000674140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'restful'
Oct 10 09:48:13 compute-2 ceph-mon[74913]: 8.a scrub starts
Oct 10 09:48:13 compute-2 ceph-mon[74913]: 8.a scrub ok
Oct 10 09:48:13 compute-2 ceph-mon[74913]: 12.1b scrub starts
Oct 10 09:48:13 compute-2 ceph-mon[74913]: 12.1b scrub ok
Oct 10 09:48:13 compute-2 ceph-mon[74913]: 11.6 scrub starts
Oct 10 09:48:13 compute-2 ceph-mon[74913]: 11.6 scrub ok
Oct 10 09:48:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:13 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rgw'
Oct 10 09:48:14 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.b deep-scrub starts
Oct 10 09:48:14 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.b deep-scrub ok
Oct 10 09:48:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:14 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:14.174+0000 7f6000674140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 09:48:14 compute-2 ceph-mgr[75218]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 09:48:14 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'rook'
Oct 10 09:48:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:14.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:14.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:14 compute-2 ceph-mon[74913]: 9.16 scrub starts
Oct 10 09:48:14 compute-2 ceph-mon[74913]: 9.16 scrub ok
Oct 10 09:48:14 compute-2 ceph-mon[74913]: 12.16 deep-scrub starts
Oct 10 09:48:14 compute-2 ceph-mon[74913]: 9.19 scrub starts
Oct 10 09:48:14 compute-2 ceph-mon[74913]: 12.16 deep-scrub ok
Oct 10 09:48:14 compute-2 ceph-mon[74913]: 9.19 scrub ok
Oct 10 09:48:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:14.766+0000 7f6000674140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 09:48:14 compute-2 ceph-mgr[75218]: mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 09:48:14 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'selftest'
Oct 10 09:48:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:14.838+0000 7f6000674140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 09:48:14 compute-2 ceph-mgr[75218]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 09:48:14 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'snap_schedule'
Oct 10 09:48:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:14.919+0000 7f6000674140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 09:48:14 compute-2 ceph-mgr[75218]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 09:48:14 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'stats'
Oct 10 09:48:14 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Oct 10 09:48:14 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Oct 10 09:48:14 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'status'
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.076+0000 7f6000674140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'telegraf'
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.146+0000 7f6000674140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'telemetry'
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.299+0000 7f6000674140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'test_orchestrator'
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.527+0000 7f6000674140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'volumes'
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:15 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938003db0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:15 compute-2 ceph-mon[74913]: 9.b deep-scrub starts
Oct 10 09:48:15 compute-2 ceph-mon[74913]: 9.b deep-scrub ok
Oct 10 09:48:15 compute-2 ceph-mon[74913]: 12.14 scrub starts
Oct 10 09:48:15 compute-2 ceph-mon[74913]: 9.1e scrub starts
Oct 10 09:48:15 compute-2 ceph-mon[74913]: 12.14 scrub ok
Oct 10 09:48:15 compute-2 ceph-mon[74913]: 9.1e scrub ok
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.800+0000 7f6000674140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr[py] Loading python module 'zabbix'
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.871+0000 7f6000674140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr load Constructed class from module: dashboard
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: ms_deliver_dispatch: unhandled message 0x56163567b860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: mgr load Constructed class from module: prometheus
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [dashboard INFO root] Configured CherryPy, starting engine...
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [dashboard INFO root] Starting engine...
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [prometheus INFO root] server_addr: :: server_port: 9283
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [prometheus INFO root] Starting engine...
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: [10/Oct/2025:09:48:15] ENGINE Bus STARTING
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [prometheus INFO cherrypy.error] [10/Oct/2025:09:48:15] ENGINE Bus STARTING
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: CherryPy Checker:
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: The Application mounted at '' has an empty config.
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:15 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:15 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.a scrub starts
Oct 10 09:48:15 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.a scrub ok
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [dashboard INFO root] Engine started...
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: [10/Oct/2025:09:48:15] ENGINE Serving on http://:::9283
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [prometheus INFO cherrypy.error] [10/Oct/2025:09:48:15] ENGINE Serving on http://:::9283
Oct 10 09:48:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: [10/Oct/2025:09:48:15] ENGINE Bus STARTED
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [prometheus INFO cherrypy.error] [10/Oct/2025:09:48:15] ENGINE Bus STARTED
Oct 10 09:48:15 compute-2 ceph-mgr[75218]: [prometheus INFO root] Engine started.
Oct 10 09:48:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:16 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:16 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Oct 10 09:48:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:16.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:16 compute-2 sshd-session[86893]: Accepted publickey for ceph-admin from 192.168.122.100 port 57350 ssh2: RSA SHA256:iFwOnwcB2x2Q1gpAWZobZa2jCZZy75CuUHv4ViVnHA0
Oct 10 09:48:16 compute-2 systemd-logind[796]: New session 38 of user ceph-admin.
Oct 10 09:48:16 compute-2 systemd[1]: Started Session 38 of User ceph-admin.
Oct 10 09:48:16 compute-2 sshd-session[86893]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Oct 10 09:48:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:16.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:16 compute-2 sudo[86900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:48:16 compute-2 sudo[86900]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:16 compute-2 sudo[86900]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:16 compute-2 ceph-mon[74913]: 11.13 scrub starts
Oct 10 09:48:16 compute-2 ceph-mon[74913]: 11.13 scrub ok
Oct 10 09:48:16 compute-2 ceph-mon[74913]: 9.1f scrub starts
Oct 10 09:48:16 compute-2 ceph-mon[74913]: 9.1f scrub ok
Oct 10 09:48:16 compute-2 ceph-mon[74913]: 12.1 scrub starts
Oct 10 09:48:16 compute-2 ceph-mon[74913]: 12.1 scrub ok
Oct 10 09:48:16 compute-2 ceph-mon[74913]: Standby manager daemon compute-1.rfugxc restarted
Oct 10 09:48:16 compute-2 ceph-mon[74913]: Standby manager daemon compute-1.rfugxc started
Oct 10 09:48:16 compute-2 ceph-mon[74913]: Standby manager daemon compute-2.gkrssp restarted
Oct 10 09:48:16 compute-2 ceph-mon[74913]: Standby manager daemon compute-2.gkrssp started
Oct 10 09:48:16 compute-2 ceph-mon[74913]: Active manager daemon compute-0.xkdepb restarted
Oct 10 09:48:16 compute-2 ceph-mon[74913]: Activating manager daemon compute-0.xkdepb
Oct 10 09:48:16 compute-2 ceph-mon[74913]: osdmap e84: 3 total, 3 up, 3 in
Oct 10 09:48:16 compute-2 ceph-mon[74913]: mgrmap e29: compute-0.xkdepb(active, starting, since 0.0333518s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.cchwlo"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.fhagzt"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.vlgajy"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr metadata", "who": "compute-0.xkdepb", "id": "compute-0.xkdepb"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr metadata", "who": "compute-1.rfugxc", "id": "compute-1.rfugxc"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr metadata", "who": "compute-2.gkrssp", "id": "compute-2.gkrssp"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mds metadata"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mon metadata"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: Manager daemon compute-0.xkdepb is now available
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/mirror_snapshot_schedule"}]: dispatch
Oct 10 09:48:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/trash_purge_schedule"}]: dispatch
Oct 10 09:48:16 compute-2 sudo[86926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Oct 10 09:48:16 compute-2 sudo[86926]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:17 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Oct 10 09:48:17 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Oct 10 09:48:17 compute-2 sudo[86832]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:17 compute-2 podman[87045]: 2025-10-10 09:48:17.364139558 +0000 UTC m=+0.052877455 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True)
Oct 10 09:48:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:17 compute-2 podman[87045]: 2025-10-10 09:48:17.450110907 +0000 UTC m=+0.138848804 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Oct 10 09:48:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:17 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:17 compute-2 ceph-mon[74913]: 11.a scrub starts
Oct 10 09:48:17 compute-2 ceph-mon[74913]: 11.a scrub ok
Oct 10 09:48:17 compute-2 ceph-mon[74913]: 8.1e scrub starts
Oct 10 09:48:17 compute-2 ceph-mon[74913]: 8.1e scrub ok
Oct 10 09:48:17 compute-2 ceph-mon[74913]: 9.15 scrub starts
Oct 10 09:48:17 compute-2 ceph-mon[74913]: 9.15 scrub ok
Oct 10 09:48:17 compute-2 ceph-mon[74913]: mgrmap e30: compute-0.xkdepb(active, since 1.06227s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:48:17 compute-2 podman[87163]: 2025-10-10 09:48:17.883889696 +0000 UTC m=+0.082478379 container exec 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 09:48:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:17 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938003db0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:17 compute-2 podman[87163]: 2025-10-10 09:48:17.916152293 +0000 UTC m=+0.114740926 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 09:48:17 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Oct 10 09:48:18 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Oct 10 09:48:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:18 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:18 compute-2 podman[87254]: 2025-10-10 09:48:18.184985297 +0000 UTC m=+0.046844393 container exec c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1)
Oct 10 09:48:18 compute-2 podman[87254]: 2025-10-10 09:48:18.191378421 +0000 UTC m=+0.053237427 container exec_died c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:48:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:18.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:18 compute-2 sshd-session[86436]: Connection closed by 192.168.122.30 port 49904
Oct 10 09:48:18 compute-2 sshd-session[86433]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:48:18 compute-2 systemd[1]: session-37.scope: Deactivated successfully.
Oct 10 09:48:18 compute-2 systemd[1]: session-37.scope: Consumed 7.918s CPU time.
Oct 10 09:48:18 compute-2 systemd-logind[796]: Session 37 logged out. Waiting for processes to exit.
Oct 10 09:48:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:18 compute-2 systemd-logind[796]: Removed session 37.
Oct 10 09:48:18 compute-2 podman[87322]: 2025-10-10 09:48:18.378374448 +0000 UTC m=+0.053643029 container exec 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 09:48:18 compute-2 podman[87322]: 2025-10-10 09:48:18.385244318 +0000 UTC m=+0.060512879 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 09:48:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:18 compute-2 podman[87385]: 2025-10-10 09:48:18.594761122 +0000 UTC m=+0.067036037 container exec 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, io.openshift.expose-services=, description=keepalived for Ceph, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, vcs-type=git, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, com.redhat.component=keepalived-container, io.buildah.version=1.28.2)
Oct 10 09:48:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:18.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:18 compute-2 podman[87385]: 2025-10-10 09:48:18.609203232 +0000 UTC m=+0.081478147 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, com.redhat.component=keepalived-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, io.openshift.expose-services=, name=keepalived)
Oct 10 09:48:18 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Oct 10 09:48:18 compute-2 sudo[86926]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:18 compute-2 ceph-mon[74913]: 9.3 scrub starts
Oct 10 09:48:18 compute-2 ceph-mon[74913]: 9.3 scrub ok
Oct 10 09:48:18 compute-2 ceph-mon[74913]: pgmap v3: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:18 compute-2 ceph-mon[74913]: 9.1c scrub starts
Oct 10 09:48:18 compute-2 ceph-mon[74913]: 9.1c scrub ok
Oct 10 09:48:18 compute-2 ceph-mon[74913]: 9.10 scrub starts
Oct 10 09:48:18 compute-2 ceph-mon[74913]: 9.10 scrub ok
Oct 10 09:48:18 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Oct 10 09:48:18 compute-2 ceph-mon[74913]: 11.1f scrub starts
Oct 10 09:48:18 compute-2 sudo[87453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:48:18 compute-2 sudo[87453]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:18 compute-2 sudo[87453]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:18 compute-2 sudo[87478]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:48:18 compute-2 sudo[87478]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:18 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Oct 10 09:48:19 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Oct 10 09:48:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:19 compute-2 sudo[87478]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:19 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:19 compute-2 sudo[87534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:48:19 compute-2 sudo[87534]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:19 compute-2 sudo[87534]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:19 compute-2 sudo[87559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Oct 10 09:48:19 compute-2 sudo[87559]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Oct 10 09:48:19 compute-2 ceph-mon[74913]: [10/Oct/2025:09:48:17] ENGINE Bus STARTING
Oct 10 09:48:19 compute-2 ceph-mon[74913]: 8.9 scrub starts
Oct 10 09:48:19 compute-2 ceph-mon[74913]: [10/Oct/2025:09:48:17] ENGINE Serving on http://192.168.122.100:8765
Oct 10 09:48:19 compute-2 ceph-mon[74913]: 8.9 scrub ok
Oct 10 09:48:19 compute-2 ceph-mon[74913]: [10/Oct/2025:09:48:18] ENGINE Serving on https://192.168.122.100:7150
Oct 10 09:48:19 compute-2 ceph-mon[74913]: [10/Oct/2025:09:48:18] ENGINE Bus STARTED
Oct 10 09:48:19 compute-2 ceph-mon[74913]: [10/Oct/2025:09:48:18] ENGINE Client ('192.168.122.100', 53560) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Oct 10 09:48:19 compute-2 ceph-mon[74913]: pgmap v4: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:19 compute-2 ceph-mon[74913]: 11.1f scrub ok
Oct 10 09:48:19 compute-2 ceph-mon[74913]: 11.12 scrub starts
Oct 10 09:48:19 compute-2 ceph-mon[74913]: 11.12 scrub ok
Oct 10 09:48:19 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Oct 10 09:48:19 compute-2 ceph-mon[74913]: osdmap e85: 3 total, 3 up, 3 in
Oct 10 09:48:19 compute-2 ceph-mon[74913]: mgrmap e31: compute-0.xkdepb(active, since 2s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:48:19 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:19 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:19 compute-2 ceph-mon[74913]: 11.10 scrub starts
Oct 10 09:48:19 compute-2 ceph-mon[74913]: 11.10 scrub ok
Oct 10 09:48:19 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:19 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:19 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:19 compute-2 sudo[87559]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:19 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Oct 10 09:48:19 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Oct 10 09:48:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:20 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938003db0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:20.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:48:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:20.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:48:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Oct 10 09:48:20 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 87 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=87 pruub=9.883892059s) [1] r=-1 lpr=87 pi=[65,87)/1 crt=51'1091 mlcod 0'0 active pruub 160.972961426s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:20 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 87 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=87 pruub=9.883842468s) [1] r=-1 lpr=87 pi=[65,87)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 160.972961426s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:20 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 87 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=87 pruub=9.875918388s) [1] r=-1 lpr=87 pi=[65,87)/1 crt=51'1091 mlcod 0'0 active pruub 160.966171265s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:20 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 87 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=87 pruub=9.875630379s) [1] r=-1 lpr=87 pi=[65,87)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 160.966171265s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:20 compute-2 ceph-mon[74913]: 9.8 scrub starts
Oct 10 09:48:20 compute-2 ceph-mon[74913]: 9.8 scrub ok
Oct 10 09:48:20 compute-2 ceph-mon[74913]: 11.1 scrub starts
Oct 10 09:48:20 compute-2 ceph-mon[74913]: 11.1 scrub ok
Oct 10 09:48:20 compute-2 ceph-mon[74913]: osdmap e86: 3 total, 3 up, 3 in
Oct 10 09:48:20 compute-2 ceph-mon[74913]: 8.2 scrub starts
Oct 10 09:48:20 compute-2 ceph-mon[74913]: 8.2 scrub ok
Oct 10 09:48:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 09:48:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Oct 10 09:48:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:20 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Oct 10 09:48:20 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Oct 10 09:48:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:21 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:21 compute-2 sudo[87605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Oct 10 09:48:21 compute-2 sudo[87605]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:21 compute-2 sudo[87605]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:21 compute-2 sudo[87630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph
Oct 10 09:48:21 compute-2 sudo[87630]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:21 compute-2 sudo[87630]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:21 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Oct 10 09:48:21 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 88 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:21 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 88 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:21 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 88 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:21 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 88 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:21 compute-2 ceph-mon[74913]: pgmap v7: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:21 compute-2 ceph-mon[74913]: 8.13 scrub starts
Oct 10 09:48:21 compute-2 ceph-mon[74913]: 8.13 scrub ok
Oct 10 09:48:21 compute-2 ceph-mon[74913]: 9.d deep-scrub starts
Oct 10 09:48:21 compute-2 ceph-mon[74913]: 9.d deep-scrub ok
Oct 10 09:48:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Oct 10 09:48:21 compute-2 ceph-mon[74913]: osdmap e87: 3 total, 3 up, 3 in
Oct 10 09:48:21 compute-2 ceph-mon[74913]: 9.17 scrub starts
Oct 10 09:48:21 compute-2 ceph-mon[74913]: 9.17 scrub ok
Oct 10 09:48:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:21 compute-2 ceph-mon[74913]: mgrmap e32: compute-0.xkdepb(active, since 4s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:48:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct 10 09:48:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Oct 10 09:48:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct 10 09:48:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:48:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:48:21 compute-2 ceph-mon[74913]: osdmap e88: 3 total, 3 up, 3 in
Oct 10 09:48:21 compute-2 sudo[87655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:48:21 compute-2 sudo[87655]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:21 compute-2 sudo[87655]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:21 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:21 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Oct 10 09:48:21 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Oct 10 09:48:21 compute-2 sudo[87680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:48:21 compute-2 sudo[87680]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:21 compute-2 sudo[87680]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:21 compute-2 sudo[87705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:48:21 compute-2 sudo[87705]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:21 compute-2 sudo[87705]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 sudo[87753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:48:22 compute-2 sudo[87753]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[87753]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 sudo[87778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new
Oct 10 09:48:22 compute-2 sudo[87778]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[87778]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:22 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:22 compute-2 sudo[87803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Oct 10 09:48:22 compute-2 sudo[87803]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[87803]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 sudo[87828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:48:22 compute-2 sudo[87828]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:22.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:22 compute-2 sudo[87828]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 sudo[87853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:48:22 compute-2 sudo[87853]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[87853]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:22 compute-2 sudo[87878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:48:22 compute-2 sudo[87878]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[87878]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:22 compute-2 sudo[87903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:48:22 compute-2 sudo[87903]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[87903]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 sudo[87928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:48:22 compute-2 sudo[87928]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[87928]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:22.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:22 compute-2 sudo[87977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:48:22 compute-2 sudo[87977]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[87977]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 sudo[88002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new
Oct 10 09:48:22 compute-2 sudo[88002]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[88002]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 sudo[88028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf.new /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:48:22 compute-2 sudo[88028]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[88028]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Oct 10 09:48:22 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 89 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] async=[1] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:22 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 89 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] async=[1] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:22 compute-2 ceph-mon[74913]: 11.11 deep-scrub starts
Oct 10 09:48:22 compute-2 ceph-mon[74913]: 11.11 deep-scrub ok
Oct 10 09:48:22 compute-2 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.conf
Oct 10 09:48:22 compute-2 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.conf
Oct 10 09:48:22 compute-2 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.conf
Oct 10 09:48:22 compute-2 ceph-mon[74913]: 11.f deep-scrub starts
Oct 10 09:48:22 compute-2 ceph-mon[74913]: 11.f deep-scrub ok
Oct 10 09:48:22 compute-2 ceph-mon[74913]: 8.1f scrub starts
Oct 10 09:48:22 compute-2 ceph-mon[74913]: 8.1f scrub ok
Oct 10 09:48:22 compute-2 ceph-mon[74913]: pgmap v10: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:22 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Oct 10 09:48:22 compute-2 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:48:22 compute-2 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:48:22 compute-2 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 09:48:22 compute-2 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Oct 10 09:48:22 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Oct 10 09:48:22 compute-2 ceph-mon[74913]: osdmap e89: 3 total, 3 up, 3 in
Oct 10 09:48:22 compute-2 sudo[88053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Oct 10 09:48:22 compute-2 sudo[88053]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[88053]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:22 compute-2 sudo[88078]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph
Oct 10 09:48:22 compute-2 sudo[88078]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:22 compute-2 sudo[88078]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 sudo[88103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:48:23 compute-2 sudo[88103]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88103]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 sudo[88128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:48:23 compute-2 sudo[88128]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88128]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 sudo[88153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:48:23 compute-2 sudo[88153]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88153]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 sudo[88201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:48:23 compute-2 sudo[88201]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88201]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 sudo[88226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new
Oct 10 09:48:23 compute-2 sudo[88226]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88226]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:23 compute-2 sudo[88251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Oct 10 09:48:23 compute-2 sudo[88251]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:23 compute-2 sudo[88251]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Oct 10 09:48:23 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 90 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=5 ec=56/45 lis/c=88/65 les/c/f=89/66/0 sis=90 pruub=15.410189629s) [1] async=[1] r=-1 lpr=90 pi=[65,90)/1 crt=51'1091 mlcod 51'1091 active pruub 169.115798950s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:23 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 90 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=5 ec=56/45 lis/c=88/65 les/c/f=89/66/0 sis=90 pruub=15.410112381s) [1] r=-1 lpr=90 pi=[65,90)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 169.115798950s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:23 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 90 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=6 ec=56/45 lis/c=88/65 les/c/f=89/66/0 sis=90 pruub=15.409692764s) [1] async=[1] r=-1 lpr=90 pi=[65,90)/1 crt=51'1091 mlcod 51'1091 active pruub 169.115676880s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:23 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 90 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=6 ec=56/45 lis/c=88/65 les/c/f=89/66/0 sis=90 pruub=15.409649849s) [1] r=-1 lpr=90 pi=[65,90)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 169.115676880s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:23 compute-2 sudo[88276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:48:23 compute-2 sudo[88276]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88276]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 sudo[88301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config
Oct 10 09:48:23 compute-2 sudo[88301]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88301]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:23 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:23 compute-2 sudo[88326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:48:23 compute-2 sudo[88326]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88326]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 sudo[88352]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 09:48:23 compute-2 sudo[88352]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88352]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 sudo[88377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:48:23 compute-2 sudo[88377]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88377]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 sudo[88425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:48:23 compute-2 sudo[88425]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88425]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 sudo[88450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new
Oct 10 09:48:23 compute-2 sudo[88450]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88450]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 sudo[88475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-21f084a3-af34-5230-afe4-ea5cd24a55f4/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring.new /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:48:23 compute-2 sudo[88475]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:23 compute-2 sudo[88475]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:23 compute-2 ceph-mon[74913]: 8.1d scrub starts
Oct 10 09:48:23 compute-2 ceph-mon[74913]: 8.1d scrub ok
Oct 10 09:48:23 compute-2 ceph-mon[74913]: 9.e scrub starts
Oct 10 09:48:23 compute-2 ceph-mon[74913]: 9.e scrub ok
Oct 10 09:48:23 compute-2 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Oct 10 09:48:23 compute-2 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Oct 10 09:48:23 compute-2 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:48:23 compute-2 ceph-mon[74913]: osdmap e90: 3 total, 3 up, 3 in
Oct 10 09:48:23 compute-2 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:48:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:23 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:24 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89480014d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:24.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Oct 10 09:48:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:24.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:24 compute-2 ceph-mon[74913]: 7.b scrub starts
Oct 10 09:48:24 compute-2 ceph-mon[74913]: 7.b scrub ok
Oct 10 09:48:24 compute-2 ceph-mon[74913]: 11.5 scrub starts
Oct 10 09:48:24 compute-2 ceph-mon[74913]: 11.5 scrub ok
Oct 10 09:48:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:24 compute-2 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 09:48:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:24 compute-2 ceph-mon[74913]: pgmap v13: 353 pgs: 2 remapped+peering, 2 peering, 349 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 0 B/s wr, 12 op/s; 54 B/s, 2 objects/s recovering
Oct 10 09:48:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:24 compute-2 ceph-mon[74913]: osdmap e91: 3 total, 3 up, 3 in
Oct 10 09:48:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Oct 10 09:48:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:25 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:25 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:25 compute-2 ceph-mon[74913]: 11.4 scrub starts
Oct 10 09:48:25 compute-2 ceph-mon[74913]: 11.4 scrub ok
Oct 10 09:48:25 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:25 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:25 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:25 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:25 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:48:25 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:48:25 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:48:25 compute-2 ceph-mon[74913]: osdmap e92: 3 total, 3 up, 3 in
Oct 10 09:48:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:26 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:26.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Oct 10 09:48:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:26.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:26 compute-2 ceph-mon[74913]: 11.7 scrub starts
Oct 10 09:48:26 compute-2 ceph-mon[74913]: 11.7 scrub ok
Oct 10 09:48:26 compute-2 ceph-mon[74913]: pgmap v16: 353 pgs: 2 remapped+peering, 2 peering, 349 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 0 B/s wr, 12 op/s; 54 B/s, 2 objects/s recovering
Oct 10 09:48:26 compute-2 ceph-mon[74913]: osdmap e93: 3 total, 3 up, 3 in
Oct 10 09:48:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:27 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:27 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Oct 10 09:48:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:27 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:27 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Oct 10 09:48:27 compute-2 ceph-mon[74913]: 8.4 deep-scrub starts
Oct 10 09:48:27 compute-2 ceph-mon[74913]: 8.4 deep-scrub ok
Oct 10 09:48:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:28 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948001670 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:28.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:28.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:28 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1a deep-scrub starts
Oct 10 09:48:28 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1a deep-scrub ok
Oct 10 09:48:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Oct 10 09:48:29 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 94 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=2 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=9.701593399s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1091 mlcod 0'0 active pruub 168.973251343s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:29 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 94 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=2 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=9.701540947s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 168.973251343s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:29 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 94 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=9.700680733s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1091 mlcod 0'0 active pruub 168.973251343s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:29 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 94 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=9.700636864s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 168.973251343s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:29 compute-2 ceph-mon[74913]: 11.1b scrub starts
Oct 10 09:48:29 compute-2 ceph-mon[74913]: 11.1b scrub ok
Oct 10 09:48:29 compute-2 ceph-mon[74913]: 8.1c scrub starts
Oct 10 09:48:29 compute-2 ceph-mon[74913]: 8.1c scrub ok
Oct 10 09:48:29 compute-2 ceph-mon[74913]: pgmap v18: 353 pgs: 353 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 116 B/s, 5 objects/s recovering
Oct 10 09:48:29 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Oct 10 09:48:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:29 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:29 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.17 scrub starts
Oct 10 09:48:29 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.17 scrub ok
Oct 10 09:48:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:29 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:30 compute-2 ceph-mon[74913]: 11.1d scrub starts
Oct 10 09:48:30 compute-2 ceph-mon[74913]: 11.1d scrub ok
Oct 10 09:48:30 compute-2 ceph-mon[74913]: 12.1a deep-scrub starts
Oct 10 09:48:30 compute-2 ceph-mon[74913]: 12.1a deep-scrub ok
Oct 10 09:48:30 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Oct 10 09:48:30 compute-2 ceph-mon[74913]: osdmap e94: 3 total, 3 up, 3 in
Oct 10 09:48:30 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:30 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Oct 10 09:48:30 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 95 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=2 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:30 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 95 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=2 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:30 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 95 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:30 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 95 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:30 compute-2 sudo[88506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:48:30 compute-2 sudo[88506]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:30 compute-2 sudo[88506]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:30 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:30.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:30 compute-2 sudo[88531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:48:30 compute-2 sudo[88531]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:30 compute-2 sudo[88531]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:30.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:30 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Oct 10 09:48:30 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Oct 10 09:48:31 compute-2 ceph-mon[74913]: 12.1c scrub starts
Oct 10 09:48:31 compute-2 ceph-mon[74913]: 12.1c scrub ok
Oct 10 09:48:31 compute-2 ceph-mon[74913]: 11.1c deep-scrub starts
Oct 10 09:48:31 compute-2 ceph-mon[74913]: 11.1c deep-scrub ok
Oct 10 09:48:31 compute-2 ceph-mon[74913]: 12.17 scrub starts
Oct 10 09:48:31 compute-2 ceph-mon[74913]: 12.17 scrub ok
Oct 10 09:48:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:31 compute-2 ceph-mon[74913]: osdmap e95: 3 total, 3 up, 3 in
Oct 10 09:48:31 compute-2 ceph-mon[74913]: pgmap v21: 353 pgs: 353 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 116 B/s, 5 objects/s recovering
Oct 10 09:48:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Oct 10 09:48:31 compute-2 ceph-mon[74913]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Oct 10 09:48:31 compute-2 ceph-mon[74913]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Oct 10 09:48:31 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Oct 10 09:48:31 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=6 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=96 pruub=10.433774948s) [1] r=-1 lpr=96 pi=[72,96)/1 crt=51'1091 mlcod 0'0 active pruub 171.756149292s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:31 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=6 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=96 pruub=10.433749199s) [1] r=-1 lpr=96 pi=[72,96)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 171.756149292s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:31 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=96 pruub=10.433111191s) [1] r=-1 lpr=96 pi=[72,96)/1 crt=51'1091 mlcod 0'0 active pruub 171.756118774s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:31 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=96 pruub=10.433096886s) [1] r=-1 lpr=96 pi=[72,96)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 171.756118774s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:31 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=2 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] async=[1] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:31 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] async=[1] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:31 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:31 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.16 deep-scrub starts
Oct 10 09:48:31 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.16 deep-scrub ok
Oct 10 09:48:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:31 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:32 compute-2 ceph-mon[74913]: 7.8 scrub starts
Oct 10 09:48:32 compute-2 ceph-mon[74913]: 7.8 scrub ok
Oct 10 09:48:32 compute-2 ceph-mon[74913]: 8.12 scrub starts
Oct 10 09:48:32 compute-2 ceph-mon[74913]: 8.12 scrub ok
Oct 10 09:48:32 compute-2 ceph-mon[74913]: 9.1d scrub starts
Oct 10 09:48:32 compute-2 ceph-mon[74913]: 9.1d scrub ok
Oct 10 09:48:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Oct 10 09:48:32 compute-2 ceph-mon[74913]: osdmap e96: 3 total, 3 up, 3 in
Oct 10 09:48:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:48:32 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Oct 10 09:48:32 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=6 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:32 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:32 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=6 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:32 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=6 ec=56/45 lis/c=95/65 les/c/f=96/66/0 sis=97 pruub=14.980080605s) [1] async=[1] r=-1 lpr=97 pi=[65,97)/1 crt=51'1091 mlcod 51'1091 active pruub 177.331207275s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:32 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:32 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=6 ec=56/45 lis/c=95/65 les/c/f=96/66/0 sis=97 pruub=14.979895592s) [1] r=-1 lpr=97 pi=[65,97)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 177.331207275s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:32 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=2 ec=56/45 lis/c=95/65 les/c/f=96/66/0 sis=97 pruub=14.974352837s) [1] async=[1] r=-1 lpr=97 pi=[65,97)/1 crt=51'1091 mlcod 51'1091 active pruub 177.326797485s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:32 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=2 ec=56/45 lis/c=95/65 les/c/f=96/66/0 sis=97 pruub=14.974159241s) [1] r=-1 lpr=97 pi=[65,97)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 177.326797485s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:32 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:32.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:32.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:32 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.3 scrub starts
Oct 10 09:48:32 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.3 scrub ok
Oct 10 09:48:33 compute-2 ceph-mon[74913]: 7.f scrub starts
Oct 10 09:48:33 compute-2 ceph-mon[74913]: 7.f scrub ok
Oct 10 09:48:33 compute-2 ceph-mon[74913]: 8.19 scrub starts
Oct 10 09:48:33 compute-2 ceph-mon[74913]: 8.19 scrub ok
Oct 10 09:48:33 compute-2 ceph-mon[74913]: 11.16 deep-scrub starts
Oct 10 09:48:33 compute-2 ceph-mon[74913]: 11.16 deep-scrub ok
Oct 10 09:48:33 compute-2 ceph-mon[74913]: osdmap e97: 3 total, 3 up, 3 in
Oct 10 09:48:33 compute-2 ceph-mon[74913]: pgmap v24: 353 pgs: 353 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Oct 10 09:48:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:33 compute-2 ceph-mon[74913]: Reconfiguring grafana.compute-0 (dependencies changed)...
Oct 10 09:48:33 compute-2 ceph-mon[74913]: Reconfiguring daemon grafana.compute-0 on compute-0
Oct 10 09:48:33 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Oct 10 09:48:33 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 98 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=6 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] async=[1] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:33 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 98 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] async=[1] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:33 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Oct 10 09:48:33 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 99 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=6 ec=56/45 lis/c=97/72 les/c/f=98/73/0 sis=99 pruub=15.675864220s) [1] async=[1] r=-1 lpr=99 pi=[72,99)/1 crt=51'1091 mlcod 51'1091 active pruub 179.389389038s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:33 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 99 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=6 ec=56/45 lis/c=97/72 les/c/f=98/73/0 sis=99 pruub=15.675802231s) [1] r=-1 lpr=99 pi=[72,99)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 179.389389038s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:33 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 99 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=5 ec=56/45 lis/c=97/72 les/c/f=98/73/0 sis=99 pruub=15.674968719s) [1] async=[1] r=-1 lpr=99 pi=[72,99)/1 crt=51'1091 mlcod 51'1091 active pruub 179.389450073s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:33 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 99 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=5 ec=56/45 lis/c=97/72 les/c/f=98/73/0 sis=99 pruub=15.674754143s) [1] r=-1 lpr=99 pi=[72,99)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 179.389450073s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:33 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:33 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Oct 10 09:48:33 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Oct 10 09:48:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:33 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:34 compute-2 sshd-session[88560]: Accepted publickey for zuul from 192.168.122.30 port 59148 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:48:34 compute-2 systemd-logind[796]: New session 39 of user zuul.
Oct 10 09:48:34 compute-2 systemd[1]: Started Session 39 of User zuul.
Oct 10 09:48:34 compute-2 sshd-session[88560]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:48:34 compute-2 ceph-mon[74913]: 7.4 scrub starts
Oct 10 09:48:34 compute-2 ceph-mon[74913]: 7.4 scrub ok
Oct 10 09:48:34 compute-2 ceph-mon[74913]: 11.1e scrub starts
Oct 10 09:48:34 compute-2 ceph-mon[74913]: 11.1e scrub ok
Oct 10 09:48:34 compute-2 ceph-mon[74913]: 12.3 scrub starts
Oct 10 09:48:34 compute-2 ceph-mon[74913]: 12.3 scrub ok
Oct 10 09:48:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Oct 10 09:48:34 compute-2 ceph-mon[74913]: osdmap e98: 3 total, 3 up, 3 in
Oct 10 09:48:34 compute-2 ceph-mon[74913]: osdmap e99: 3 total, 3 up, 3 in
Oct 10 09:48:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:34 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:34.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Oct 10 09:48:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:34.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:34 compute-2 python3.9[88714]: ansible-ansible.legacy.ping Invoked with data=pong
Oct 10 09:48:34 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.13 scrub starts
Oct 10 09:48:34 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.13 scrub ok
Oct 10 09:48:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:35 compute-2 ceph-mon[74913]: 7.3 deep-scrub starts
Oct 10 09:48:35 compute-2 ceph-mon[74913]: 7.3 deep-scrub ok
Oct 10 09:48:35 compute-2 ceph-mon[74913]: 9.11 scrub starts
Oct 10 09:48:35 compute-2 ceph-mon[74913]: 9.11 scrub ok
Oct 10 09:48:35 compute-2 ceph-mon[74913]: 11.17 scrub starts
Oct 10 09:48:35 compute-2 ceph-mon[74913]: 11.17 scrub ok
Oct 10 09:48:35 compute-2 ceph-mon[74913]: pgmap v27: 353 pgs: 2 remapped+peering, 2 peering, 349 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 54 B/s, 1 objects/s recovering
Oct 10 09:48:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Oct 10 09:48:35 compute-2 ceph-mon[74913]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Oct 10 09:48:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Oct 10 09:48:35 compute-2 ceph-mon[74913]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Oct 10 09:48:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Oct 10 09:48:35 compute-2 ceph-mon[74913]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Oct 10 09:48:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:35 compute-2 ceph-mon[74913]: osdmap e100: 3 total, 3 up, 3 in
Oct 10 09:48:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Oct 10 09:48:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:35 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:35 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.9 deep-scrub starts
Oct 10 09:48:35 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.9 deep-scrub ok
Oct 10 09:48:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:35 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:36 compute-2 ceph-mon[74913]: 12.a scrub starts
Oct 10 09:48:36 compute-2 ceph-mon[74913]: 12.a scrub ok
Oct 10 09:48:36 compute-2 ceph-mon[74913]: 11.1a scrub starts
Oct 10 09:48:36 compute-2 ceph-mon[74913]: 11.1a scrub ok
Oct 10 09:48:36 compute-2 ceph-mon[74913]: 9.13 scrub starts
Oct 10 09:48:36 compute-2 ceph-mon[74913]: 9.13 scrub ok
Oct 10 09:48:36 compute-2 ceph-mon[74913]: osdmap e101: 3 total, 3 up, 3 in
Oct 10 09:48:36 compute-2 ceph-mon[74913]: 9.12 scrub starts
Oct 10 09:48:36 compute-2 ceph-mon[74913]: 9.12 scrub ok
Oct 10 09:48:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:36 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:36 compute-2 python3.9[88889]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:48:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:36.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:36 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Oct 10 09:48:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:36.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:36 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.e scrub starts
Oct 10 09:48:36 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.e scrub ok
Oct 10 09:48:37 compute-2 sudo[89045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgbmxnhjpbfyxygnxiztsodqofkyvtpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089716.8554068-95-29981070853873/AnsiballZ_command.py'
Oct 10 09:48:37 compute-2 sudo[89045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:48:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:37 compute-2 python3.9[89047]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:48:37 compute-2 sudo[89045]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:37 compute-2 ceph-mon[74913]: 7.2 scrub starts
Oct 10 09:48:37 compute-2 ceph-mon[74913]: 7.2 scrub ok
Oct 10 09:48:37 compute-2 ceph-mon[74913]: 12.9 deep-scrub starts
Oct 10 09:48:37 compute-2 ceph-mon[74913]: 12.9 deep-scrub ok
Oct 10 09:48:37 compute-2 ceph-mon[74913]: pgmap v30: 353 pgs: 2 remapped+peering, 2 peering, 349 active+clean; 457 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 54 B/s, 1 objects/s recovering
Oct 10 09:48:37 compute-2 ceph-mon[74913]: osdmap e102: 3 total, 3 up, 3 in
Oct 10 09:48:37 compute-2 ceph-mon[74913]: 11.14 scrub starts
Oct 10 09:48:37 compute-2 ceph-mon[74913]: 11.14 scrub ok
Oct 10 09:48:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:37 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:37 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Oct 10 09:48:37 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Oct 10 09:48:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:37 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:38 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:38.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:38 compute-2 sudo[89198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmzoildaekrwfgreurycaosilcueiqyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089717.9948645-131-193936976086094/AnsiballZ_stat.py'
Oct 10 09:48:38 compute-2 sudo[89198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:48:38 compute-2 ceph-mon[74913]: 7.6 scrub starts
Oct 10 09:48:38 compute-2 ceph-mon[74913]: 7.6 scrub ok
Oct 10 09:48:38 compute-2 ceph-mon[74913]: 11.e scrub starts
Oct 10 09:48:38 compute-2 ceph-mon[74913]: 11.e scrub ok
Oct 10 09:48:38 compute-2 ceph-mon[74913]: 12.8 scrub starts
Oct 10 09:48:38 compute-2 ceph-mon[74913]: 12.8 scrub ok
Oct 10 09:48:38 compute-2 ceph-mon[74913]: 9.a scrub starts
Oct 10 09:48:38 compute-2 ceph-mon[74913]: 9.a scrub ok
Oct 10 09:48:38 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:38 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:38 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:48:38 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:48:38 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Oct 10 09:48:38 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:38 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:38 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:48:38 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:48:38 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:48:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:38.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:38 compute-2 python3.9[89200]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:48:38 compute-2 sudo[89198]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:38 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.d scrub starts
Oct 10 09:48:38 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.d scrub ok
Oct 10 09:48:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Oct 10 09:48:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:39 compute-2 ceph-mon[74913]: 11.3 scrub starts
Oct 10 09:48:39 compute-2 ceph-mon[74913]: 11.3 scrub ok
Oct 10 09:48:39 compute-2 ceph-mon[74913]: pgmap v32: 353 pgs: 353 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 8 B/s, 4 objects/s recovering
Oct 10 09:48:39 compute-2 ceph-mon[74913]: 12.b scrub starts
Oct 10 09:48:39 compute-2 ceph-mon[74913]: 12.b scrub ok
Oct 10 09:48:39 compute-2 ceph-mon[74913]: 9.f scrub starts
Oct 10 09:48:39 compute-2 ceph-mon[74913]: 9.f scrub ok
Oct 10 09:48:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Oct 10 09:48:39 compute-2 ceph-mon[74913]: osdmap e103: 3 total, 3 up, 3 in
Oct 10 09:48:39 compute-2 sudo[89354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruuzjfpjofbhmdimkmboljfjhntwjrnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089719.1467757-165-212414177179711/AnsiballZ_file.py'
Oct 10 09:48:39 compute-2 sudo[89354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:48:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:39 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:39 compute-2 python3.9[89356]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:48:39 compute-2 sudo[89354]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:39 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Oct 10 09:48:39 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Oct 10 09:48:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:39 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:40 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:40.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Oct 10 09:48:40 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 104 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=104) [2] r=0 lpr=104 pi=[82,104)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:40 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 104 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=104) [2] r=0 lpr=104 pi=[82,104)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:40 compute-2 ceph-mon[74913]: 8.d scrub starts
Oct 10 09:48:40 compute-2 ceph-mon[74913]: 8.d scrub ok
Oct 10 09:48:40 compute-2 ceph-mon[74913]: 12.6 scrub starts
Oct 10 09:48:40 compute-2 ceph-mon[74913]: 12.6 scrub ok
Oct 10 09:48:40 compute-2 ceph-mon[74913]: 9.6 scrub starts
Oct 10 09:48:40 compute-2 ceph-mon[74913]: 9.6 scrub ok
Oct 10 09:48:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Oct 10 09:48:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:48:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:40.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:48:40 compute-2 python3.9[89507]: ansible-ansible.builtin.service_facts Invoked
Oct 10 09:48:40 compute-2 network[89524]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 09:48:40 compute-2 network[89525]: 'network-scripts' will be removed from distribution in near future.
Oct 10 09:48:40 compute-2 network[89526]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 09:48:40 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.18 scrub starts
Oct 10 09:48:40 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.18 scrub ok
Oct 10 09:48:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:41 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Oct 10 09:48:41 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 105 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[82,105)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:41 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 105 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[82,105)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:41 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 105 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[82,105)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:41 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 105 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[82,105)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:41 compute-2 ceph-mon[74913]: 8.16 scrub starts
Oct 10 09:48:41 compute-2 ceph-mon[74913]: 8.16 scrub ok
Oct 10 09:48:41 compute-2 ceph-mon[74913]: pgmap v34: 353 pgs: 353 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 7 B/s, 3 objects/s recovering
Oct 10 09:48:41 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Oct 10 09:48:41 compute-2 ceph-mon[74913]: osdmap e104: 3 total, 3 up, 3 in
Oct 10 09:48:41 compute-2 ceph-mon[74913]: 12.c scrub starts
Oct 10 09:48:41 compute-2 ceph-mon[74913]: 12.c scrub ok
Oct 10 09:48:41 compute-2 ceph-mon[74913]: 8.1b scrub starts
Oct 10 09:48:41 compute-2 ceph-mon[74913]: 8.1b scrub ok
Oct 10 09:48:41 compute-2 ceph-mon[74913]: osdmap e105: 3 total, 3 up, 3 in
Oct 10 09:48:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:41 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002390 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:41 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Oct 10 09:48:41 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Oct 10 09:48:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:41 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003cf0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:42 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:48:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:42.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:48:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:42 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Oct 10 09:48:42 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 106 pg[10.10( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=106) [2] r=0 lpr=106 pi=[56,106)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:42 compute-2 ceph-mon[74913]: 12.18 scrub starts
Oct 10 09:48:42 compute-2 ceph-mon[74913]: 12.18 scrub ok
Oct 10 09:48:42 compute-2 ceph-mon[74913]: 7.9 scrub starts
Oct 10 09:48:42 compute-2 ceph-mon[74913]: 7.9 scrub ok
Oct 10 09:48:42 compute-2 ceph-mon[74913]: 8.18 scrub starts
Oct 10 09:48:42 compute-2 ceph-mon[74913]: 8.18 scrub ok
Oct 10 09:48:42 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Oct 10 09:48:42 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Oct 10 09:48:42 compute-2 ceph-mon[74913]: osdmap e106: 3 total, 3 up, 3 in
Oct 10 09:48:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:48:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:42.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:48:42 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.13 scrub starts
Oct 10 09:48:42 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.13 scrub ok
Oct 10 09:48:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:43 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Oct 10 09:48:43 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:43 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:43 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:43 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:43 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.10( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=107) [2]/[1] r=-1 lpr=107 pi=[56,107)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:43 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.10( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=107) [2]/[1] r=-1 lpr=107 pi=[56,107)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:43 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:43 compute-2 ceph-mon[74913]: 10.4 scrub starts
Oct 10 09:48:43 compute-2 ceph-mon[74913]: 10.4 scrub ok
Oct 10 09:48:43 compute-2 ceph-mon[74913]: pgmap v37: 353 pgs: 353 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 7 B/s, 3 objects/s recovering
Oct 10 09:48:43 compute-2 ceph-mon[74913]: 7.e scrub starts
Oct 10 09:48:43 compute-2 ceph-mon[74913]: 7.e scrub ok
Oct 10 09:48:43 compute-2 ceph-mon[74913]: 10.16 scrub starts
Oct 10 09:48:43 compute-2 ceph-mon[74913]: 10.16 scrub ok
Oct 10 09:48:43 compute-2 ceph-mon[74913]: osdmap e107: 3 total, 3 up, 3 in
Oct 10 09:48:43 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.14 scrub starts
Oct 10 09:48:43 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.14 scrub ok
Oct 10 09:48:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:43 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:44 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003d10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:44.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:44 compute-2 sudo[89644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:48:44 compute-2 sudo[89644]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:44 compute-2 sudo[89644]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Oct 10 09:48:44 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 108 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=5 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:44 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 108 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=6 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:44 compute-2 ceph-mon[74913]: 10.13 scrub starts
Oct 10 09:48:44 compute-2 ceph-mon[74913]: 10.13 scrub ok
Oct 10 09:48:44 compute-2 ceph-mon[74913]: 7.1e deep-scrub starts
Oct 10 09:48:44 compute-2 ceph-mon[74913]: 7.1e deep-scrub ok
Oct 10 09:48:44 compute-2 ceph-mon[74913]: 10.e scrub starts
Oct 10 09:48:44 compute-2 ceph-mon[74913]: 10.e scrub ok
Oct 10 09:48:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:48:44 compute-2 ceph-mon[74913]: osdmap e108: 3 total, 3 up, 3 in
Oct 10 09:48:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:48:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:44.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:48:44 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Oct 10 09:48:44 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Oct 10 09:48:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Oct 10 09:48:45 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 109 pg[10.10( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=2 ec=56/45 lis/c=107/56 les/c/f=108/57/0 sis=109) [2] r=0 lpr=109 pi=[56,109)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:45 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 109 pg[10.10( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=2 ec=56/45 lis/c=107/56 les/c/f=108/57/0 sis=109) [2] r=0 lpr=109 pi=[56,109)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:45 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:45 compute-2 ceph-mon[74913]: 10.14 scrub starts
Oct 10 09:48:45 compute-2 ceph-mon[74913]: 10.14 scrub ok
Oct 10 09:48:45 compute-2 ceph-mon[74913]: pgmap v40: 353 pgs: 2 remapped+peering, 351 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:45 compute-2 ceph-mon[74913]: 12.10 scrub starts
Oct 10 09:48:45 compute-2 ceph-mon[74913]: 12.10 scrub ok
Oct 10 09:48:45 compute-2 ceph-mon[74913]: 10.11 scrub starts
Oct 10 09:48:45 compute-2 ceph-mon[74913]: 10.c scrub starts
Oct 10 09:48:45 compute-2 ceph-mon[74913]: 10.c scrub ok
Oct 10 09:48:45 compute-2 ceph-mon[74913]: osdmap e109: 3 total, 3 up, 3 in
Oct 10 09:48:45 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Oct 10 09:48:45 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Oct 10 09:48:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:45 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:46 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:46.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:46 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Oct 10 09:48:46 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 110 pg[10.10( v 51'1091 (0'0,51'1091] local-lis/les=109/110 n=2 ec=56/45 lis/c=107/56 les/c/f=108/57/0 sis=109) [2] r=0 lpr=109 pi=[56,109)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:46 compute-2 python3.9[89819]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:48:46 compute-2 ceph-mon[74913]: 10.11 scrub ok
Oct 10 09:48:46 compute-2 ceph-mon[74913]: 12.e scrub starts
Oct 10 09:48:46 compute-2 ceph-mon[74913]: 12.e scrub ok
Oct 10 09:48:46 compute-2 ceph-mon[74913]: 10.3 scrub starts
Oct 10 09:48:46 compute-2 ceph-mon[74913]: 10.a deep-scrub starts
Oct 10 09:48:46 compute-2 ceph-mon[74913]: 10.a deep-scrub ok
Oct 10 09:48:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:48:46 compute-2 ceph-mon[74913]: osdmap e110: 3 total, 3 up, 3 in
Oct 10 09:48:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:46.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:46 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.f scrub starts
Oct 10 09:48:46 compute-2 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.f scrub ok
Oct 10 09:48:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:47 compute-2 python3.9[89971]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:48:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:47 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003d30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:47 compute-2 ceph-mon[74913]: 10.3 scrub ok
Oct 10 09:48:47 compute-2 ceph-mon[74913]: pgmap v43: 353 pgs: 2 remapped+peering, 351 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:47 compute-2 ceph-mon[74913]: 7.1b scrub starts
Oct 10 09:48:47 compute-2 ceph-mon[74913]: 7.1b scrub ok
Oct 10 09:48:47 compute-2 ceph-mon[74913]: 10.f scrub starts
Oct 10 09:48:47 compute-2 ceph-mon[74913]: 10.f scrub ok
Oct 10 09:48:47 compute-2 ceph-mon[74913]: 10.9 scrub starts
Oct 10 09:48:47 compute-2 ceph-mon[74913]: 10.9 scrub ok
Oct 10 09:48:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:47 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:48 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:48.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:48.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Oct 10 09:48:48 compute-2 ceph-mon[74913]: 12.12 scrub starts
Oct 10 09:48:48 compute-2 ceph-mon[74913]: 12.12 scrub ok
Oct 10 09:48:48 compute-2 ceph-mon[74913]: 10.b scrub starts
Oct 10 09:48:48 compute-2 ceph-mon[74913]: 10.b scrub ok
Oct 10 09:48:48 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Oct 10 09:48:48 compute-2 python3.9[90126]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:48:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:49 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:49 compute-2 sudo[90283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iolfiovwqbenjrlyuzgouqpfaslhpafk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089729.4177904-309-189809139430663/AnsiballZ_setup.py'
Oct 10 09:48:49 compute-2 sudo[90283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:48:49 compute-2 ceph-mon[74913]: pgmap v45: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Oct 10 09:48:49 compute-2 ceph-mon[74913]: osdmap e111: 3 total, 3 up, 3 in
Oct 10 09:48:49 compute-2 ceph-mon[74913]: 7.10 scrub starts
Oct 10 09:48:49 compute-2 ceph-mon[74913]: 7.10 scrub ok
Oct 10 09:48:49 compute-2 ceph-mon[74913]: 10.6 scrub starts
Oct 10 09:48:49 compute-2 ceph-mon[74913]: 10.6 scrub ok
Oct 10 09:48:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:49 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:50 compute-2 python3.9[90285]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:48:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:50 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:50 compute-2 sudo[90291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:48:50 compute-2 sudo[90291]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:48:50 compute-2 sudo[90291]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:50 compute-2 sudo[90283]: pam_unix(sudo:session): session closed for user root
Oct 10 09:48:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:48:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:50.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:48:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:50.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Oct 10 09:48:50 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 112 pg[10.12( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=66/66 les/c/f=67/67/0 sis=112) [2] r=0 lpr=112 pi=[66,112)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:50 compute-2 ceph-mon[74913]: 12.19 scrub starts
Oct 10 09:48:50 compute-2 ceph-mon[74913]: 12.19 scrub ok
Oct 10 09:48:50 compute-2 ceph-mon[74913]: 10.19 scrub starts
Oct 10 09:48:50 compute-2 ceph-mon[74913]: 10.19 scrub ok
Oct 10 09:48:50 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Oct 10 09:48:50 compute-2 sudo[90394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sggjknnadfzepbpwkhklwjooykywlcrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089729.4177904-309-189809139430663/AnsiballZ_dnf.py'
Oct 10 09:48:50 compute-2 sudo[90394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:48:51 compute-2 python3.9[90396]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:48:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:51 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:51 compute-2 ceph-mon[74913]: pgmap v47: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:51 compute-2 ceph-mon[74913]: 7.18 scrub starts
Oct 10 09:48:51 compute-2 ceph-mon[74913]: 7.18 scrub ok
Oct 10 09:48:51 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Oct 10 09:48:51 compute-2 ceph-mon[74913]: osdmap e112: 3 total, 3 up, 3 in
Oct 10 09:48:51 compute-2 ceph-mon[74913]: 10.1a scrub starts
Oct 10 09:48:51 compute-2 ceph-mon[74913]: 10.1a scrub ok
Oct 10 09:48:51 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Oct 10 09:48:51 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 113 pg[10.12( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=66/66 les/c/f=67/67/0 sis=113) [2]/[0] r=-1 lpr=113 pi=[66,113)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:51 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 113 pg[10.12( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=66/66 les/c/f=67/67/0 sis=113) [2]/[0] r=-1 lpr=113 pi=[66,113)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:51 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:52 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:52.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:52.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:52 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Oct 10 09:48:52 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 114 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=114 pruub=9.957026482s) [0] r=-1 lpr=114 pi=[65,114)/1 crt=51'1091 mlcod 0'0 active pruub 192.967025757s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:52 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 114 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=114 pruub=9.956990242s) [0] r=-1 lpr=114 pi=[65,114)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 192.967025757s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:52 compute-2 ceph-mon[74913]: 10.2 deep-scrub starts
Oct 10 09:48:52 compute-2 ceph-mon[74913]: 10.2 deep-scrub ok
Oct 10 09:48:52 compute-2 ceph-mon[74913]: osdmap e113: 3 total, 3 up, 3 in
Oct 10 09:48:52 compute-2 ceph-mon[74913]: 10.1c scrub starts
Oct 10 09:48:52 compute-2 ceph-mon[74913]: 10.1c scrub ok
Oct 10 09:48:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Oct 10 09:48:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Oct 10 09:48:53 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 115 pg[10.12( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=4 ec=56/45 lis/c=113/66 les/c/f=114/67/0 sis=115) [2] r=0 lpr=115 pi=[66,115)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:53 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 115 pg[10.12( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=4 ec=56/45 lis/c=113/66 les/c/f=114/67/0 sis=115) [2] r=0 lpr=115 pi=[66,115)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:53 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 115 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] r=0 lpr=115 pi=[65,115)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:53 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 115 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] r=0 lpr=115 pi=[65,115)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:53 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:53 compute-2 ceph-mon[74913]: pgmap v50: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:53 compute-2 ceph-mon[74913]: 10.5 deep-scrub starts
Oct 10 09:48:53 compute-2 ceph-mon[74913]: 10.5 deep-scrub ok
Oct 10 09:48:53 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Oct 10 09:48:53 compute-2 ceph-mon[74913]: osdmap e114: 3 total, 3 up, 3 in
Oct 10 09:48:53 compute-2 ceph-mon[74913]: 10.1d scrub starts
Oct 10 09:48:53 compute-2 ceph-mon[74913]: 10.1d scrub ok
Oct 10 09:48:53 compute-2 ceph-mon[74913]: osdmap e115: 3 total, 3 up, 3 in
Oct 10 09:48:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:53 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:54 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938002810 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:54.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Oct 10 09:48:54 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 116 pg[10.12( v 51'1091 (0'0,51'1091] local-lis/les=115/116 n=4 ec=56/45 lis/c=113/66 les/c/f=114/67/0 sis=115) [2] r=0 lpr=115 pi=[66,115)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:54 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 116 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=115/116 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] async=[0] r=0 lpr=115 pi=[65,115)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:48:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:54.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:54 compute-2 ceph-mon[74913]: 10.1e scrub starts
Oct 10 09:48:54 compute-2 ceph-mon[74913]: 10.1e scrub ok
Oct 10 09:48:54 compute-2 ceph-mon[74913]: osdmap e116: 3 total, 3 up, 3 in
Oct 10 09:48:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Oct 10 09:48:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 117 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=115/116 n=5 ec=56/45 lis/c=115/65 les/c/f=116/66/0 sis=117 pruub=15.093819618s) [0] async=[0] r=-1 lpr=117 pi=[65,117)/1 crt=51'1091 mlcod 51'1091 active pruub 200.847595215s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:55 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 117 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=115/116 n=5 ec=56/45 lis/c=115/65 les/c/f=116/66/0 sis=117 pruub=15.093749046s) [0] r=-1 lpr=117 pi=[65,117)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 200.847595215s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:55 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:55 compute-2 ceph-mon[74913]: pgmap v53: 353 pgs: 1 remapped+peering, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:55 compute-2 ceph-mon[74913]: osdmap e117: 3 total, 3 up, 3 in
Oct 10 09:48:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:55 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:56 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:56.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Oct 10 09:48:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:48:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:56.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:48:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:57 compute-2 ceph-mon[74913]: pgmap v56: 353 pgs: 1 remapped+peering, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:57 compute-2 ceph-mon[74913]: osdmap e118: 3 total, 3 up, 3 in
Oct 10 09:48:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:57 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:57 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:58 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:48:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:58.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:48:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:58 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Oct 10 09:48:58 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Oct 10 09:48:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 119 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=119 pruub=14.952677727s) [0] r=-1 lpr=119 pi=[72,119)/1 crt=51'1091 mlcod 0'0 active pruub 203.756973267s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:58 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 119 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=119 pruub=14.952638626s) [0] r=-1 lpr=119 pi=[72,119)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 203.756973267s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:48:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:48:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:48:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:58.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:48:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:48:59 compute-2 ceph-mon[74913]: pgmap v58: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:48:59 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Oct 10 09:48:59 compute-2 ceph-mon[74913]: osdmap e119: 3 total, 3 up, 3 in
Oct 10 09:48:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Oct 10 09:48:59 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 120 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=120) [0]/[2] r=0 lpr=120 pi=[72,120)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:48:59 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 120 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=120) [0]/[2] r=0 lpr=120 pi=[72,120)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:48:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:59 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:48:59 compute-2 systemd[82041]: Starting Mark boot as successful...
Oct 10 09:48:59 compute-2 systemd[82041]: Finished Mark boot as successful.
Oct 10 09:48:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 09:48:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:59 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:00 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:49:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:00.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:49:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:00 compute-2 ceph-mon[74913]: osdmap e120: 3 total, 3 up, 3 in
Oct 10 09:49:00 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Oct 10 09:49:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Oct 10 09:49:00 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 121 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=120/121 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[72,120)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:49:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:49:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:00.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:49:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:01 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:01 compute-2 ceph-mon[74913]: pgmap v61: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:49:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Oct 10 09:49:01 compute-2 ceph-mon[74913]: osdmap e121: 3 total, 3 up, 3 in
Oct 10 09:49:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:49:01 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Oct 10 09:49:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 122 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=120/121 n=5 ec=56/45 lis/c=120/72 les/c/f=121/73/0 sis=122 pruub=14.985898972s) [0] async=[0] r=-1 lpr=122 pi=[72,122)/1 crt=51'1091 mlcod 51'1091 active pruub 206.866821289s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:49:01 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 122 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=120/121 n=5 ec=56/45 lis/c=120/72 les/c/f=121/73/0 sis=122 pruub=14.985840797s) [0] r=-1 lpr=122 pi=[72,122)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 206.866821289s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:49:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:01 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:02 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:02.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Oct 10 09:49:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:02.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:02 compute-2 ceph-mon[74913]: osdmap e122: 3 total, 3 up, 3 in
Oct 10 09:49:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Oct 10 09:49:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:03 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:03 compute-2 ceph-mon[74913]: pgmap v64: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:49:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Oct 10 09:49:03 compute-2 ceph-mon[74913]: osdmap e123: 3 total, 3 up, 3 in
Oct 10 09:49:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:03 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:04 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:04.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:04.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:05 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:05 compute-2 ceph-mon[74913]: pgmap v66: 353 pgs: 1 peering, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 1 objects/s recovering
Oct 10 09:49:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:05 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c004340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:06 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c004340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:06.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:06.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:07 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c004340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:07 compute-2 ceph-mon[74913]: pgmap v67: 353 pgs: 1 peering, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Oct 10 09:49:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:07 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c004340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:08 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:08.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:08.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:08 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Oct 10 09:49:08 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Oct 10 09:49:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:09 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c003900 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:09 compute-2 ceph-mon[74913]: pgmap v68: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 403 B/s rd, 0 op/s; 0 B/s, 0 objects/s recovering
Oct 10 09:49:09 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Oct 10 09:49:09 compute-2 ceph-mon[74913]: osdmap e124: 3 total, 3 up, 3 in
Oct 10 09:49:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:09 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c003900 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:10 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c003900 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:10.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:10 compute-2 sudo[90535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:49:10 compute-2 sudo[90535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:49:10 compute-2 sudo[90535]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:10.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:10 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Oct 10 09:49:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Oct 10 09:49:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8930000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:11 compute-2 ceph-mon[74913]: pgmap v70: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 383 B/s rd, 0 op/s; 0 B/s, 0 objects/s recovering
Oct 10 09:49:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Oct 10 09:49:11 compute-2 ceph-mon[74913]: osdmap e125: 3 total, 3 up, 3 in
Oct 10 09:49:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f892c000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:12 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c003900 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:12.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:12.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:12 compute-2 ceph-mon[74913]: pgmap v72: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 383 B/s rd, 0 op/s
Oct 10 09:49:12 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Oct 10 09:49:12 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Oct 10 09:49:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:13 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Oct 10 09:49:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0044c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Oct 10 09:49:13 compute-2 ceph-mon[74913]: osdmap e126: 3 total, 3 up, 3 in
Oct 10 09:49:13 compute-2 ceph-mon[74913]: osdmap e127: 3 total, 3 up, 3 in
Oct 10 09:49:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0044c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:14 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f892c0016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:14.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Oct 10 09:49:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:14.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:14 compute-2 ceph-mon[74913]: pgmap v75: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:49:14 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Oct 10 09:49:14 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Oct 10 09:49:14 compute-2 ceph-mon[74913]: osdmap e128: 3 total, 3 up, 3 in
Oct 10 09:49:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Oct 10 09:49:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:15 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c003900 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:15 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0044c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:16 compute-2 kernel: ganesha.nfsd[90534]: segfault at 50 ip 00007f8a14c6a32e sp 00007f89d0ff8210 error 4 in libntirpc.so.5.8[7f8a14c4f000+2c000] likely on CPU 5 (core 0, socket 5)
Oct 10 09:49:16 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 09:49:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:16 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0044c0 fd 47 proxy ignored for local
Oct 10 09:49:16 compute-2 systemd[1]: Created slice Slice /system/systemd-coredump.
Oct 10 09:49:16 compute-2 systemd[1]: Started Process Core Dump (PID 90582/UID 0).
Oct 10 09:49:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:16.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:16 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Oct 10 09:49:16 compute-2 ceph-mon[74913]: osdmap e129: 3 total, 3 up, 3 in
Oct 10 09:49:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Oct 10 09:49:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:49:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 09:49:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:16.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 09:49:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/094916 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 09:49:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:17 compute-2 systemd-coredump[90583]: Process 85054 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 63:
                                                   #0  0x00007f8a14c6a32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   ELF object binary architecture: AMD x86-64
Oct 10 09:49:17 compute-2 systemd[1]: systemd-coredump@0-90582-0.service: Deactivated successfully.
Oct 10 09:49:17 compute-2 systemd[1]: systemd-coredump@0-90582-0.service: Consumed 1.182s CPU time.
Oct 10 09:49:17 compute-2 ceph-mon[74913]: pgmap v78: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:49:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Oct 10 09:49:17 compute-2 ceph-mon[74913]: osdmap e130: 3 total, 3 up, 3 in
Oct 10 09:49:17 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Oct 10 09:49:17 compute-2 podman[90598]: 2025-10-10 09:49:17.558629243 +0000 UTC m=+0.027462861 container died c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Oct 10 09:49:17 compute-2 systemd[1]: var-lib-containers-storage-overlay-65c18ffc3984bb82f7acc157cc3b25e9b8553569bbeae84a5fa3da5f5bd939d9-merged.mount: Deactivated successfully.
Oct 10 09:49:17 compute-2 podman[90598]: 2025-10-10 09:49:17.591254203 +0000 UTC m=+0.060087811 container remove c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:49:17 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 09:49:17 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 09:49:17 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.693s CPU time.
Oct 10 09:49:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:18.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:18 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Oct 10 09:49:18 compute-2 ceph-mon[74913]: osdmap e131: 3 total, 3 up, 3 in
Oct 10 09:49:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:18.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:19 compute-2 ceph-mon[74913]: pgmap v81: 353 pgs: 1 unknown, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s; 0 B/s, 1 objects/s recovering
Oct 10 09:49:19 compute-2 ceph-mon[74913]: osdmap e132: 3 total, 3 up, 3 in
Oct 10 09:49:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Oct 10 09:49:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:20.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:20 compute-2 ceph-mon[74913]: osdmap e133: 3 total, 3 up, 3 in
Oct 10 09:49:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Oct 10 09:49:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:20.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:21 compute-2 ceph-mon[74913]: pgmap v84: 353 pgs: 1 unknown, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s; 0 B/s, 1 objects/s recovering
Oct 10 09:49:21 compute-2 ceph-mon[74913]: osdmap e134: 3 total, 3 up, 3 in
Oct 10 09:49:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/094922 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 09:49:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:22.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:22.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:23 compute-2 ceph-mon[74913]: pgmap v86: 353 pgs: 1 unknown, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail
Oct 10 09:49:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:24.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Oct 10 09:49:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Oct 10 09:49:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:24.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:25 compute-2 ceph-mon[74913]: pgmap v87: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 36 B/s, 0 objects/s recovering
Oct 10 09:49:25 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Oct 10 09:49:25 compute-2 ceph-mon[74913]: osdmap e135: 3 total, 3 up, 3 in
Oct 10 09:49:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:26.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:26.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:26 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Oct 10 09:49:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Oct 10 09:49:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:27 compute-2 ceph-mon[74913]: pgmap v89: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 33 B/s, 0 objects/s recovering
Oct 10 09:49:27 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Oct 10 09:49:27 compute-2 ceph-mon[74913]: osdmap e136: 3 total, 3 up, 3 in
Oct 10 09:49:27 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 1.
Oct 10 09:49:27 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:49:27 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.693s CPU time.
Oct 10 09:49:27 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:49:28 compute-2 podman[90703]: 2025-10-10 09:49:28.13715586 +0000 UTC m=+0.036707547 container create dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 09:49:28 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b345d6ff3a1e4aeaa218ddb360f02e9bc3886d1e16b0ede0f4a70a77a5db6da/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 09:49:28 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b345d6ff3a1e4aeaa218ddb360f02e9bc3886d1e16b0ede0f4a70a77a5db6da/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:49:28 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b345d6ff3a1e4aeaa218ddb360f02e9bc3886d1e16b0ede0f4a70a77a5db6da/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:49:28 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b345d6ff3a1e4aeaa218ddb360f02e9bc3886d1e16b0ede0f4a70a77a5db6da/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:49:28 compute-2 podman[90703]: 2025-10-10 09:49:28.193260015 +0000 UTC m=+0.092811722 container init dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Oct 10 09:49:28 compute-2 podman[90703]: 2025-10-10 09:49:28.198277909 +0000 UTC m=+0.097829596 container start dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:49:28 compute-2 bash[90703]: dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7
Oct 10 09:49:28 compute-2 podman[90703]: 2025-10-10 09:49:28.121464258 +0000 UTC m=+0.021015965 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:49:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 09:49:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 09:49:28 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:49:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 09:49:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 09:49:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 09:49:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 09:49:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 09:49:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:49:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:28.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:28.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:28 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Oct 10 09:49:28 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 137 pg[10.1e( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=80/80 les/c/f=81/81/0 sis=137) [2] r=0 lpr=137 pi=[80,137)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:49:28 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Oct 10 09:49:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:29 compute-2 ceph-mon[74913]: pgmap v91: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 674 B/s wr, 1 op/s; 28 B/s, 0 objects/s recovering
Oct 10 09:49:29 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Oct 10 09:49:29 compute-2 ceph-mon[74913]: osdmap e137: 3 total, 3 up, 3 in
Oct 10 09:49:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Oct 10 09:49:29 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 138 pg[10.1e( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=80/80 les/c/f=81/81/0 sis=138) [2]/[1] r=-1 lpr=138 pi=[80,138)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:49:29 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 138 pg[10.1e( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=80/80 les/c/f=81/81/0 sis=138) [2]/[1] r=-1 lpr=138 pi=[80,138)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 09:49:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 09:49:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:30.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 09:49:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:30 compute-2 sudo[90763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:49:30 compute-2 sudo[90763]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:49:30 compute-2 sudo[90763]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:30.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Oct 10 09:49:30 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 139 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=5 ec=56/45 lis/c=107/107 les/c/f=108/108/0 sis=139 pruub=9.683565140s) [1] r=-1 lpr=139 pi=[107,139)/1 crt=51'1091 mlcod 0'0 active pruub 230.735855103s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:49:30 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 139 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=5 ec=56/45 lis/c=107/107 les/c/f=108/108/0 sis=139 pruub=9.683433533s) [1] r=-1 lpr=139 pi=[107,139)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 230.735855103s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:49:30 compute-2 ceph-mon[74913]: osdmap e138: 3 total, 3 up, 3 in
Oct 10 09:49:30 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 09:49:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:31 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e140 e140: 3 total, 3 up, 3 in
Oct 10 09:49:31 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 140 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=5 ec=56/45 lis/c=107/107 les/c/f=108/108/0 sis=140) [1]/[2] r=0 lpr=140 pi=[107,140)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:49:31 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 140 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=5 ec=56/45 lis/c=107/107 les/c/f=108/108/0 sis=140) [1]/[2] r=0 lpr=140 pi=[107,140)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 09:49:31 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 140 pg[10.1e( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=138/80 les/c/f=139/81/0 sis=140) [2] r=0 lpr=140 pi=[80,140)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:49:31 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 140 pg[10.1e( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=138/80 les/c/f=139/81/0 sis=140) [2] r=0 lpr=140 pi=[80,140)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 09:49:31 compute-2 ceph-mon[74913]: pgmap v94: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 925 B/s wr, 2 op/s
Oct 10 09:49:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 09:49:31 compute-2 ceph-mon[74913]: osdmap e139: 3 total, 3 up, 3 in
Oct 10 09:49:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:49:31 compute-2 ceph-mon[74913]: osdmap e140: 3 total, 3 up, 3 in
Oct 10 09:49:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:32.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:32.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:32 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e141 e141: 3 total, 3 up, 3 in
Oct 10 09:49:32 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 141 pg[10.1e( v 51'1091 (0'0,51'1091] local-lis/les=140/141 n=5 ec=56/45 lis/c=138/80 les/c/f=139/81/0 sis=140) [2] r=0 lpr=140 pi=[80,140)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:49:33 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 141 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=140/141 n=5 ec=56/45 lis/c=107/107 les/c/f=108/108/0 sis=140) [1]/[2] async=[1] r=0 lpr=140 pi=[107,140)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 09:49:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:33 compute-2 sudo[90394]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:33 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e142 e142: 3 total, 3 up, 3 in
Oct 10 09:49:33 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 142 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=140/141 n=5 ec=56/45 lis/c=140/107 les/c/f=141/108/0 sis=142 pruub=15.522687912s) [1] async=[1] r=-1 lpr=142 pi=[107,142)/1 crt=51'1091 mlcod 51'1091 active pruub 239.268997192s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 09:49:33 compute-2 ceph-osd[77423]: osd.2 pg_epoch: 142 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=140/141 n=5 ec=56/45 lis/c=140/107 les/c/f=141/108/0 sis=142 pruub=15.522623062s) [1] r=-1 lpr=142 pi=[107,142)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 239.268997192s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 09:49:33 compute-2 ceph-mon[74913]: pgmap v97: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.7 KiB/s wr, 5 op/s
Oct 10 09:49:33 compute-2 ceph-mon[74913]: osdmap e141: 3 total, 3 up, 3 in
Oct 10 09:49:33 compute-2 ceph-mon[74913]: osdmap e142: 3 total, 3 up, 3 in
Oct 10 09:49:34 compute-2 sudo[90941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqmwdjvjzxchblgiqwyixozojxxbionr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089773.7740886-345-180216362684209/AnsiballZ_command.py'
Oct 10 09:49:34 compute-2 sudo[90941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 09:49:34 compute-2 python3.9[90943]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:49:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:34.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 e143: 3 total, 3 up, 3 in
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 09:49:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:34.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:49:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:49:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:35 compute-2 sudo[90941]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:35 compute-2 ceph-mon[74913]: pgmap v100: 353 pgs: 1 remapped+peering, 1 peering, 351 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 1.7 KiB/s wr, 5 op/s; 0 B/s, 1 objects/s recovering
Oct 10 09:49:35 compute-2 ceph-mon[74913]: osdmap e143: 3 total, 3 up, 3 in
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.527277) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775527312, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 3147, "num_deletes": 252, "total_data_size": 9639001, "memory_usage": 9779192, "flush_reason": "Manual Compaction"}
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775557627, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 6118759, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7607, "largest_seqno": 10749, "table_properties": {"data_size": 6104678, "index_size": 9103, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3909, "raw_key_size": 34854, "raw_average_key_size": 22, "raw_value_size": 6074200, "raw_average_value_size": 3949, "num_data_blocks": 395, "num_entries": 1538, "num_filter_entries": 1538, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089676, "oldest_key_time": 1760089676, "file_creation_time": 1760089775, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 30439 microseconds, and 12322 cpu microseconds.
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.557704) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 6118759 bytes OK
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.557735) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.559100) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.559117) EVENT_LOG_v1 {"time_micros": 1760089775559112, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.559146) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 9623836, prev total WAL file size 9623836, number of live WAL files 2.
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.561223) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(5975KB)], [18(10MB)]
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775561299, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17453573, "oldest_snapshot_seqno": -1}
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4077 keys, 13425836 bytes, temperature: kUnknown
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775629611, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 13425836, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13393323, "index_size": 21203, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10245, "raw_key_size": 104148, "raw_average_key_size": 25, "raw_value_size": 13313454, "raw_average_value_size": 3265, "num_data_blocks": 912, "num_entries": 4077, "num_filter_entries": 4077, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760089775, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.630082) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 13425836 bytes
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.632416) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 255.1 rd, 196.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.8, 10.8 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(5.0) write-amplify(2.2) OK, records in: 4615, records dropped: 538 output_compression: NoCompression
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.632450) EVENT_LOG_v1 {"time_micros": 1760089775632435, "job": 8, "event": "compaction_finished", "compaction_time_micros": 68416, "compaction_time_cpu_micros": 30577, "output_level": 6, "num_output_files": 1, "total_output_size": 13425836, "num_input_records": 4615, "num_output_records": 4077, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775634690, "job": 8, "event": "table_file_deletion", "file_number": 20}
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775638264, "job": 8, "event": "table_file_deletion", "file_number": 18}
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.561062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.638355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.638362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.638365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.638367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:49:35 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.638368) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:49:36 compute-2 sudo[91230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzjdlygpcfaaswbqthebgfqwnnmvltdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089775.6537836-371-116868832858848/AnsiballZ_selinux.py'
Oct 10 09:49:36 compute-2 sudo[91230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 09:49:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:36.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 09:49:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:36 compute-2 python3.9[91232]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct 10 09:49:36 compute-2 sudo[91230]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:36.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/094936 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 09:49:37 compute-2 sudo[91384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnmdexpagjxiijbwklwpfdsgmnlefrgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089777.0192778-402-656377315900/AnsiballZ_command.py'
Oct 10 09:49:37 compute-2 sudo[91384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:37 compute-2 python3.9[91386]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct 10 09:49:37 compute-2 sudo[91384]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:37 compute-2 ceph-mon[74913]: pgmap v102: 353 pgs: 1 remapped+peering, 1 peering, 351 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.6 KiB/s wr, 5 op/s; 0 B/s, 1 objects/s recovering
Oct 10 09:49:38 compute-2 sudo[91536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-simwbiivxozurtpaskddimakxhjxdoaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089777.8344467-425-196479819596990/AnsiballZ_file.py'
Oct 10 09:49:38 compute-2 sudo[91536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:38.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:38 compute-2 python3.9[91538]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:49:38 compute-2 sudo[91536]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:38.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:39 compute-2 sudo[91690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndmpaksemqxfiptabrkoyqjtygjvhzoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089778.8277366-450-168374002670437/AnsiballZ_mount.py'
Oct 10 09:49:39 compute-2 sudo[91690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:39 compute-2 python3.9[91692]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct 10 09:49:39 compute-2 sudo[91690]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:39 compute-2 ceph-mon[74913]: pgmap v103: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s; 18 B/s, 1 objects/s recovering
Oct 10 09:49:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:40.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:40.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000005:nfs.cephfs.1: -2
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 09:49:40 compute-2 sudo[91844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvnexqlrmwguxuridwgduppvgrwvtvpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089780.4762592-534-236848524936801/AnsiballZ_file.py'
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 09:49:40 compute-2 sudo[91844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 09:49:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:49:40 compute-2 python3.9[91852]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:49:40 compute-2 sudo[91844]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:41 compute-2 sudo[92007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkwhmivkyevvwspkswjylfpoloumunhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089781.2381256-558-89049592879871/AnsiballZ_stat.py'
Oct 10 09:49:41 compute-2 sudo[92007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:41 compute-2 ceph-mon[74913]: pgmap v104: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 829 B/s wr, 2 op/s; 14 B/s, 0 objects/s recovering
Oct 10 09:49:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:41 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda7c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:41 compute-2 python3.9[92009]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:49:41 compute-2 sudo[92007]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:41 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:41 compute-2 sudo[92089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfdlhdkkluwstetjximtlpdmbokqbklj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089781.2381256-558-89049592879871/AnsiballZ_file.py'
Oct 10 09:49:41 compute-2 sudo[92089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:42 compute-2 python3.9[92091]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:49:42 compute-2 sudo[92089]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:42 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:42.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 09:49:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:42.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 09:49:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:43 compute-2 ceph-mon[74913]: pgmap v105: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1.0 KiB/s wr, 4 op/s; 12 B/s, 0 objects/s recovering
Oct 10 09:49:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:43 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:43 compute-2 sudo[92243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdvkjpynouovzxkmnnhorgbalmybplko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089783.3516426-630-215797700054434/AnsiballZ_getent.py'
Oct 10 09:49:43 compute-2 sudo[92243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:43 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:44 compute-2 python3.9[92245]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct 10 09:49:44 compute-2 sudo[92243]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/094944 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 09:49:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:44 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:44.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:44 compute-2 sudo[92300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:49:44 compute-2 sudo[92300]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:49:44 compute-2 sudo[92300]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:44 compute-2 sudo[92350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:49:44 compute-2 sudo[92350]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:49:44 compute-2 sudo[92447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhgwvopsuuahuklvfmkayqvhysnrixav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089784.4533207-660-136793458965002/AnsiballZ_getent.py'
Oct 10 09:49:44 compute-2 sudo[92447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:44.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:44 compute-2 python3.9[92449]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct 10 09:49:44 compute-2 sudo[92447]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:45 compute-2 sudo[92350]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:45 compute-2 ceph-mon[74913]: pgmap v106: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 921 B/s wr, 3 op/s; 10 B/s, 0 objects/s recovering
Oct 10 09:49:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:45 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:45 compute-2 sudo[92631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcanznexkmspboertyrjflszdcdupxnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089785.3099785-684-41655822814129/AnsiballZ_group.py'
Oct 10 09:49:45 compute-2 sudo[92631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:45 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:46 compute-2 python3.9[92633]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 09:49:46 compute-2 sudo[92631]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:46 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:46.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:46 compute-2 sudo[92784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfdgqehtloamyhdqhqqayeggbbnjwgix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089786.3616867-711-124742383054163/AnsiballZ_file.py'
Oct 10 09:49:46 compute-2 sudo[92784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:49:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:46.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:46 compute-2 python3.9[92786]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct 10 09:49:46 compute-2 sudo[92784]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:47 compute-2 sudo[92937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxgwtdrbsrygmsvyckkjqppgtiflyeud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089787.3390927-743-256291102460540/AnsiballZ_dnf.py'
Oct 10 09:49:47 compute-2 sudo[92937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:47 compute-2 ceph-mon[74913]: pgmap v107: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 785 B/s wr, 3 op/s; 9 B/s, 0 objects/s recovering
Oct 10 09:49:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:47 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:47 compute-2 python3.9[92939]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:49:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:47 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda580016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:48 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 09:49:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:48.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 09:49:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:48.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:49:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:49:49 compute-2 ceph-mon[74913]: pgmap v108: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 1023 B/s wr, 3 op/s; 9 B/s, 0 objects/s recovering
Oct 10 09:49:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:49:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:49:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:49:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:49:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:49:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:49:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:49:49 compute-2 sudo[92937]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:49 compute-2 sudo[93092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naqoajqnmylykipwisjdybjtlyltgmrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089789.3590863-768-145163908672903/AnsiballZ_file.py'
Oct 10 09:49:49 compute-2 sudo[93092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:49 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:49 compute-2 python3.9[93094]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:49:49 compute-2 sudo[93092]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:49 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:50 compute-2 ceph-mon[74913]: mgrmap e33: compute-0.xkdepb(active, since 92s), standbys: compute-1.rfugxc, compute-2.gkrssp
Oct 10 09:49:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:50 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:50 compute-2 sudo[93244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpundmfirxexjoehthvxdrzdyvyupygs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089790.1428854-792-170090792618887/AnsiballZ_stat.py'
Oct 10 09:49:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:50.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:50 compute-2 sudo[93244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:50 compute-2 sudo[93247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:49:50 compute-2 sudo[93247]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:49:50 compute-2 sudo[93247]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:50 compute-2 python3.9[93246]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:49:50 compute-2 sudo[93244]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:50.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:50 compute-2 sudo[93349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aasceyfanazvfgzpdisfjcnpkbjjthmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089790.1428854-792-170090792618887/AnsiballZ_file.py'
Oct 10 09:49:50 compute-2 sudo[93349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:51 compute-2 python3.9[93351]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:49:51 compute-2 sudo[93349]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:51 compute-2 ceph-mon[74913]: pgmap v109: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 511 B/s wr, 1 op/s
Oct 10 09:49:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:51 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:51 compute-2 sudo[93501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptdksyikmxublwdhcawtimypvjdhmsfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089791.5609117-831-100651694292539/AnsiballZ_stat.py'
Oct 10 09:49:51 compute-2 sudo[93501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:51 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:52 compute-2 python3.9[93503]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:49:52 compute-2 sudo[93501]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:52 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:52 compute-2 sudo[93579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcfyeifjdzlflqrumnhagiphxzdpbxvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089791.5609117-831-100651694292539/AnsiballZ_file.py'
Oct 10 09:49:52 compute-2 sudo[93579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:52.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:52 compute-2 python3.9[93581]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:49:52 compute-2 sudo[93579]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:52.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:53 compute-2 ceph-mon[74913]: pgmap v110: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Oct 10 09:49:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:53 compute-2 sudo[93733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pznyscettjfoapafpvfbpeotqsbrfhbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089793.156924-876-213346748288749/AnsiballZ_dnf.py'
Oct 10 09:49:53 compute-2 sudo[93733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:53 compute-2 python3.9[93735]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:49:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:53 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:53 compute-2 sudo[93737]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:49:53 compute-2 sudo[93737]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:49:53 compute-2 sudo[93737]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:53 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:54 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 09:49:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:54.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 09:49:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:54 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:49:54 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:49:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:54.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:54 compute-2 sudo[93733]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:55 compute-2 ceph-mon[74913]: pgmap v111: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 255 B/s wr, 0 op/s
Oct 10 09:49:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:55 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:55 compute-2 python3.9[93913]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:49:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:55 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:56 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:56.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:56 compute-2 python3.9[94066]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct 10 09:49:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:56.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:57 compute-2 python3.9[94217]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:49:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:57 compute-2 ceph-mon[74913]: pgmap v112: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 255 B/s wr, 0 op/s
Oct 10 09:49:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:57 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda540030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:57 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:58 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:58.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:49:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:49:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:58.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:49:58 compute-2 sudo[94369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlekavanncnquqkoyuaclayaqnxfutja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089798.1928675-999-6747150972946/AnsiballZ_systemd.py'
Oct 10 09:49:58 compute-2 sudo[94369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:49:59 compute-2 python3.9[94371]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:49:59 compute-2 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct 10 09:49:59 compute-2 systemd[1]: tuned.service: Deactivated successfully.
Oct 10 09:49:59 compute-2 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct 10 09:49:59 compute-2 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 10 09:49:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:49:59 compute-2 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 10 09:49:59 compute-2 sudo[94369]: pam_unix(sudo:session): session closed for user root
Oct 10 09:49:59 compute-2 ceph-mon[74913]: pgmap v113: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 255 B/s wr, 0 op/s
Oct 10 09:49:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:59 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:49:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:49:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:59 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda540030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:00 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:00.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:00 compute-2 python3.9[94532]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct 10 09:50:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:00 compute-2 ceph-mon[74913]: overall HEALTH_OK
Oct 10 09:50:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:00.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:01 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:01 compute-2 ceph-mon[74913]: pgmap v114: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:50:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:01 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:02 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:50:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:02.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:50:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:02.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:03 compute-2 sudo[94686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebknlqitebydtdyhnizmtqykwxfmbwtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089803.203819-1169-260171017003096/AnsiballZ_systemd.py'
Oct 10 09:50:03 compute-2 sudo[94686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:03 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:03 compute-2 ceph-mon[74913]: pgmap v115: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:50:03 compute-2 python3.9[94688]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:50:03 compute-2 sudo[94686]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:03 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:04 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:04 compute-2 sudo[94840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxmskmjksiexblhpeqvfigtmdhbdjxpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089803.9943323-1169-90728554659211/AnsiballZ_systemd.py'
Oct 10 09:50:04 compute-2 sudo[94840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:04.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:04 compute-2 python3.9[94842]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:50:04 compute-2 sudo[94840]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:04.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:50:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:05 compute-2 sshd-session[88563]: Connection closed by 192.168.122.30 port 59148
Oct 10 09:50:05 compute-2 sshd-session[88560]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:50:05 compute-2 systemd[1]: session-39.scope: Deactivated successfully.
Oct 10 09:50:05 compute-2 systemd[1]: session-39.scope: Consumed 1min 1.913s CPU time.
Oct 10 09:50:05 compute-2 systemd-logind[796]: Session 39 logged out. Waiting for processes to exit.
Oct 10 09:50:05 compute-2 systemd-logind[796]: Removed session 39.
Oct 10 09:50:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:05 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:05 compute-2 ceph-mon[74913]: pgmap v116: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:06 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:06 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:06.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:06.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:07 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:07 compute-2 ceph-mon[74913]: pgmap v117: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:08 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:08 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:08.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:08.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:09 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:09 compute-2 ceph-mon[74913]: pgmap v118: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:50:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:10 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:10 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 09:50:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:10.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 09:50:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:10 compute-2 sudo[94876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:50:10 compute-2 sudo[94876]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:50:10 compute-2 sudo[94876]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:10.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:11 compute-2 sshd-session[94902]: Accepted publickey for zuul from 192.168.122.30 port 50478 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:50:11 compute-2 systemd-logind[796]: New session 40 of user zuul.
Oct 10 09:50:11 compute-2 systemd[1]: Started Session 40 of User zuul.
Oct 10 09:50:11 compute-2 sshd-session[94902]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:50:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:11 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:11 compute-2 ceph-mon[74913]: pgmap v119: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:12 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:12 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:12 compute-2 python3.9[95055]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:50:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 09:50:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:12.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 09:50:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:12.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:13 compute-2 sudo[95211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oatbwazjlxrtppyfbkxsckpghoyzifcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089812.8963149-70-250672646232212/AnsiballZ_getent.py'
Oct 10 09:50:13 compute-2 sudo[95211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:13 compute-2 python3.9[95213]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct 10 09:50:13 compute-2 sudo[95211]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:13 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:13 compute-2 ceph-mon[74913]: pgmap v120: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:50:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:14 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:14 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda78001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:14.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:14 compute-2 sudo[95365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oombfmmlgnqvglofpzoxqgdxzvnwitdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089814.2245297-106-208775322075279/AnsiballZ_setup.py'
Oct 10 09:50:14 compute-2 sudo[95365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:14.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:14 compute-2 python3.9[95368]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:50:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:50:14 compute-2 sudo[95365]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:15 compute-2 sudo[95451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjoulbkjnvfhhahrxsnvpyeivkhyocch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089814.2245297-106-208775322075279/AnsiballZ_dnf.py'
Oct 10 09:50:15 compute-2 sudo[95451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:15 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:15 compute-2 python3.9[95453]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 10 09:50:15 compute-2 ceph-mon[74913]: pgmap v121: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:16 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:16 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:16.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:16.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:50:16 compute-2 sudo[95451]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:17 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda78001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:17 compute-2 sudo[95606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-savrtknyhhnpoawelwbkhuhzejacvemb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089817.4703207-148-45031900562436/AnsiballZ_dnf.py'
Oct 10 09:50:17 compute-2 sudo[95606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:17 compute-2 ceph-mon[74913]: pgmap v122: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:17 compute-2 python3.9[95608]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:50:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:18 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:18 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:18.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:18.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:19 compute-2 sudo[95606]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:19 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:19 compute-2 ceph-mon[74913]: pgmap v123: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:50:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:20 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda780027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:20 compute-2 sudo[95761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyzdpuvfzpwmvwpcxtbdqgaknhfwlbav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089819.478802-172-121088728440481/AnsiballZ_systemd.py'
Oct 10 09:50:20 compute-2 sudo[95761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:20 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:20 compute-2 python3.9[95763]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 09:50:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:20.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:20 compute-2 sudo[95761]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:20.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:21 compute-2 python3.9[95918]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:50:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:21 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:21 compute-2 ceph-mon[74913]: pgmap v124: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:22 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:22 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:22.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:22 compute-2 sudo[96068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktwlpuafnquauavqebxkzmmrijspepsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089821.771826-227-58038548310115/AnsiballZ_sefcontext.py'
Oct 10 09:50:22 compute-2 sudo[96068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:22 compute-2 python3.9[96070]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct 10 09:50:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 09:50:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:22.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 09:50:22 compute-2 sudo[96068]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:22 compute-2 ceph-mon[74913]: pgmap v125: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:50:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:23 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda780027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:23 compute-2 python3.9[96222]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:50:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:24 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:24 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 09:50:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:24.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 09:50:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:24 compute-2 sudo[96379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgzmsitartxomcrishjfriupycdarugf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089824.430485-280-274017448210082/AnsiballZ_dnf.py'
Oct 10 09:50:24 compute-2 sudo[96379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:24.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:50:24 compute-2 python3.9[96381]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:50:25 compute-2 ceph-mon[74913]: pgmap v126: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:25 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:26 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda780027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:26 compute-2 sudo[96379]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:26 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:26.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:26.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:27 compute-2 sudo[96535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayjhpgquolweahujwscdkbaitfwfrtct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089826.4896023-304-59877289245308/AnsiballZ_command.py'
Oct 10 09:50:27 compute-2 sudo[96535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:27 compute-2 python3.9[96537]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:50:27 compute-2 ceph-mon[74913]: pgmap v127: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:27 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:28 compute-2 sudo[96535]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:28.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:28.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:28 compute-2 sudo[96824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdbqttgawftchwnfxveeyhndtawlyrvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089828.502127-329-43151274812174/AnsiballZ_file.py'
Oct 10 09:50:28 compute-2 sudo[96824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:29 compute-2 python3.9[96826]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 10 09:50:29 compute-2 sudo[96824]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:29 compute-2 ceph-mon[74913]: pgmap v128: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:29 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda78003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:50:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:30 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:30 compute-2 python3.9[96976]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:50:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:30 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:30.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:30 compute-2 sudo[97150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anmeleviblzsjafpcgdukgiksqrljyrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089830.4011831-376-2839412401324/AnsiballZ_dnf.py'
Oct 10 09:50:30 compute-2 sudo[97150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:30 compute-2 sudo[97110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:50:30 compute-2 sudo[97110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:50:30 compute-2 sudo[97110]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 09:50:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:30.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 09:50:30 compute-2 python3.9[97154]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:50:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:31 compute-2 ceph-mon[74913]: pgmap v129: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:50:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:31 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:32 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda78003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:32 compute-2 sudo[97150]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:32 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 09:50:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:32.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 09:50:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:32.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:32 compute-2 sudo[97310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txbrvhvsmwozsvdzzuhpegmdqhuzgrke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089832.588824-403-195123447842098/AnsiballZ_dnf.py'
Oct 10 09:50:32 compute-2 sudo[97310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:33 compute-2 python3.9[97312]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:50:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:33 compute-2 ceph-mon[74913]: pgmap v130: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:50:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:33 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda78003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:34 compute-2 sudo[97310]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:34.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:34.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:50:35 compute-2 sudo[97465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obvfitiaberxokprpuleoxsvefbsikdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089834.8335266-439-12194619012886/AnsiballZ_stat.py'
Oct 10 09:50:35 compute-2 sudo[97465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:35 compute-2 python3.9[97467]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:50:35 compute-2 sudo[97465]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:35 compute-2 ceph-mon[74913]: pgmap v131: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:35 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:36 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:50:36 compute-2 sudo[97620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syytubqtncvmxpapvtyhcbxvrsxxvxhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089835.7710617-463-231676017153093/AnsiballZ_slurp.py'
Oct 10 09:50:36 compute-2 sudo[97620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:36 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58001040 fd 38 proxy ignored for local
Oct 10 09:50:36 compute-2 kernel: ganesha.nfsd[97494]: segfault at 50 ip 00007fdb2e01632e sp 00007fdaee7fb210 error 4 in libntirpc.so.5.8[7fdb2dffb000+2c000] likely on CPU 2 (core 0, socket 2)
Oct 10 09:50:36 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 09:50:36 compute-2 systemd[1]: Started Process Core Dump (PID 97623/UID 0).
Oct 10 09:50:36 compute-2 python3.9[97622]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Oct 10 09:50:36 compute-2 sudo[97620]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:36.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:36.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:37 compute-2 systemd-coredump[97624]: Process 90723 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 56:
                                                   #0  0x00007fdb2e01632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   ELF object binary architecture: AMD x86-64
Oct 10 09:50:37 compute-2 systemd[1]: systemd-coredump@1-97623-0.service: Deactivated successfully.
Oct 10 09:50:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:37 compute-2 sshd-session[94905]: Connection closed by 192.168.122.30 port 50478
Oct 10 09:50:37 compute-2 sshd-session[94902]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:50:37 compute-2 systemd[1]: session-40.scope: Deactivated successfully.
Oct 10 09:50:37 compute-2 systemd[1]: session-40.scope: Consumed 17.687s CPU time.
Oct 10 09:50:37 compute-2 systemd-logind[796]: Session 40 logged out. Waiting for processes to exit.
Oct 10 09:50:37 compute-2 systemd-logind[796]: Removed session 40.
Oct 10 09:50:37 compute-2 podman[97655]: 2025-10-10 09:50:37.395284857 +0000 UTC m=+0.030011196 container died dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Oct 10 09:50:37 compute-2 systemd[1]: var-lib-containers-storage-overlay-1b345d6ff3a1e4aeaa218ddb360f02e9bc3886d1e16b0ede0f4a70a77a5db6da-merged.mount: Deactivated successfully.
Oct 10 09:50:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:37 compute-2 podman[97655]: 2025-10-10 09:50:37.433952286 +0000 UTC m=+0.068678605 container remove dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True)
Oct 10 09:50:37 compute-2 ceph-mon[74913]: pgmap v132: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:37 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 09:50:37 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 09:50:37 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.232s CPU time.
Oct 10 09:50:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:38.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:38.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:39 compute-2 ceph-mon[74913]: pgmap v133: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:50:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:40.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:40.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:41 compute-2 ceph-mon[74913]: pgmap v134: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:50:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095042 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 09:50:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:42.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:42.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:43 compute-2 sshd-session[97705]: Accepted publickey for zuul from 192.168.122.30 port 37094 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:50:43 compute-2 systemd-logind[796]: New session 41 of user zuul.
Oct 10 09:50:43 compute-2 systemd[1]: Started Session 41 of User zuul.
Oct 10 09:50:43 compute-2 sshd-session[97705]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:50:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:43 compute-2 ceph-mon[74913]: pgmap v135: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:50:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:44.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:44 compute-2 python3.9[97858]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:50:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:44.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:50:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:45 compute-2 ceph-mon[74913]: pgmap v136: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:50:45 compute-2 python3.9[98014]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:50:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:46.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:50:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:46.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:46 compute-2 python3.9[98208]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:50:47 compute-2 sshd-session[97708]: Connection closed by 192.168.122.30 port 37094
Oct 10 09:50:47 compute-2 sshd-session[97705]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:50:47 compute-2 systemd[1]: session-41.scope: Deactivated successfully.
Oct 10 09:50:47 compute-2 systemd[1]: session-41.scope: Consumed 2.431s CPU time.
Oct 10 09:50:47 compute-2 systemd-logind[796]: Session 41 logged out. Waiting for processes to exit.
Oct 10 09:50:47 compute-2 systemd-logind[796]: Removed session 41.
Oct 10 09:50:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:47 compute-2 ceph-mon[74913]: pgmap v137: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:50:47 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 2.
Oct 10 09:50:47 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:50:47 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.232s CPU time.
Oct 10 09:50:47 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:50:47 compute-2 podman[98282]: 2025-10-10 09:50:47.976968245 +0000 UTC m=+0.044330913 container create ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Oct 10 09:50:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93915b46d87e5adfc5a8e959d16f7d82e85ff82cf718b869d3a86bc987db93cb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 09:50:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93915b46d87e5adfc5a8e959d16f7d82e85ff82cf718b869d3a86bc987db93cb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:50:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93915b46d87e5adfc5a8e959d16f7d82e85ff82cf718b869d3a86bc987db93cb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:50:48 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93915b46d87e5adfc5a8e959d16f7d82e85ff82cf718b869d3a86bc987db93cb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:50:48 compute-2 podman[98282]: 2025-10-10 09:50:48.049521647 +0000 UTC m=+0.116884345 container init ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Oct 10 09:50:48 compute-2 podman[98282]: 2025-10-10 09:50:47.955859252 +0000 UTC m=+0.023221960 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:50:48 compute-2 podman[98282]: 2025-10-10 09:50:48.056789159 +0000 UTC m=+0.124151837 container start ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:50:48 compute-2 bash[98282]: ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec
Oct 10 09:50:48 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:50:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 09:50:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 09:50:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 09:50:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 09:50:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 09:50:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 09:50:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 09:50:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:50:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:48.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 09:50:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:48.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 09:50:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:49 compute-2 ceph-mon[74913]: pgmap v138: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:50:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:50:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:50.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:50.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:50 compute-2 sudo[98343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:50:50 compute-2 sudo[98343]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:50:50 compute-2 sudo[98343]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:51 compute-2 ceph-mon[74913]: pgmap v139: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:50:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:52.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 09:50:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:52.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 09:50:53 compute-2 sshd-session[98371]: Accepted publickey for zuul from 192.168.122.30 port 42478 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:50:53 compute-2 systemd-logind[796]: New session 42 of user zuul.
Oct 10 09:50:53 compute-2 systemd[1]: Started Session 42 of User zuul.
Oct 10 09:50:53 compute-2 sshd-session[98371]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:50:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:53 compute-2 ceph-mon[74913]: pgmap v140: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:50:53 compute-2 sudo[98451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:50:53 compute-2 sudo[98451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:50:53 compute-2 sudo[98451]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:54 compute-2 sudo[98499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:50:54 compute-2 sudo[98499]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:50:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:50:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:50:54 compute-2 python3.9[98574]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:50:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:54.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:54 compute-2 sudo[98499]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:54.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:50:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:55 compute-2 python3.9[98762]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:50:55 compute-2 ceph-mon[74913]: pgmap v141: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:50:55 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:50:55 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:50:55 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:50:55 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:50:55 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:50:55 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:50:55 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:50:56 compute-2 sudo[98916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypapqcvgvxkssvisqceetrdktbzudrqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089855.9382915-82-210288488366205/AnsiballZ_setup.py'
Oct 10 09:50:56 compute-2 sudo[98916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:56.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:56 compute-2 python3.9[98918]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:50:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 09:50:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:56.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 09:50:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:57 compute-2 ceph-mon[74913]: pgmap v142: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:50:57 compute-2 sudo[98916]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:57 compute-2 sudo[99002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxtppvzntxqrqetwevvibbomibqfntvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089855.9382915-82-210288488366205/AnsiballZ_dnf.py'
Oct 10 09:50:57 compute-2 sudo[99002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:58 compute-2 python3.9[99004]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:50:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:50:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:58.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:50:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:50:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 09:50:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:58.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 09:50:59 compute-2 sudo[99002]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:50:59 compute-2 ceph-mon[74913]: pgmap v143: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:50:59 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:50:59 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:50:59 compute-2 sudo[99079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:50:59 compute-2 sudo[99079]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:50:59 compute-2 sudo[99079]: pam_unix(sudo:session): session closed for user root
Oct 10 09:50:59 compute-2 sudo[99182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouabnkyjezisdnfgilbxglbeczyvduqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089859.5699658-118-179981673184281/AnsiballZ_setup.py'
Oct 10 09:50:59 compute-2 sudo[99182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:50:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:00 compute-2 python3.9[99184]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:00 compute-2 sudo[99182]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:00.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:00.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:01 compute-2 sudo[99395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgnqaapfdvatevdgradbxbfxenniagvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089860.8904896-151-278067472762818/AnsiballZ_file.py'
Oct 10 09:51:01 compute-2 sudo[99395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:01 compute-2 python3.9[99397]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:01 compute-2 sudo[99395]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:01 compute-2 ceph-mon[74913]: pgmap v144: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:51:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:51:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:01 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095102 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 09:51:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:02 compute-2 sudo[99547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rncimvsngbwfrclbyevfrgqyhusnunap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089861.9053113-175-95099234198857/AnsiballZ_command.py'
Oct 10 09:51:02 compute-2 sudo[99547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:02.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:02 compute-2 sshd-session[70470]: Received disconnect from 38.102.83.82 port 56466:11: disconnected by user
Oct 10 09:51:02 compute-2 sshd-session[70470]: Disconnected from user zuul 38.102.83.82 port 56466
Oct 10 09:51:02 compute-2 sshd-session[70467]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:51:02 compute-2 systemd[1]: session-20.scope: Deactivated successfully.
Oct 10 09:51:02 compute-2 systemd[1]: session-20.scope: Consumed 8.834s CPU time.
Oct 10 09:51:02 compute-2 systemd-logind[796]: Session 20 logged out. Waiting for processes to exit.
Oct 10 09:51:02 compute-2 systemd-logind[796]: Removed session 20.
Oct 10 09:51:02 compute-2 python3.9[99549]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:51:02 compute-2 sudo[99547]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:02.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:03 compute-2 sudo[99714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnvhqjhrfvaekzduyixjajdwisosnccc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089862.9604795-200-205214730111895/AnsiballZ_stat.py'
Oct 10 09:51:03 compute-2 sudo[99714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:03 compute-2 python3.9[99716]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:51:03 compute-2 sudo[99714]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:03 compute-2 ceph-mon[74913]: pgmap v145: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:51:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:03 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:03 compute-2 sudo[99792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdxgqvkryhvbcvcaannggujtkcsvetux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089862.9604795-200-205214730111895/AnsiballZ_file.py'
Oct 10 09:51:03 compute-2 sudo[99792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:04 compute-2 python3.9[99794]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:04 compute-2 sudo[99792]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:04.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:04 compute-2 sudo[99945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbrpgqatkilaxlhirzulckvyammwrufa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089864.446701-236-34621800360020/AnsiballZ_stat.py'
Oct 10 09:51:04 compute-2 sudo[99945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 09:51:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:04.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 09:51:04 compute-2 python3.9[99947]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:51:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:04 compute-2 sudo[99945]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:05 compute-2 sudo[100024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xduwbiistmfcucpbvlzfnduimmzjkzgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089864.446701-236-34621800360020/AnsiballZ_file.py'
Oct 10 09:51:05 compute-2 sudo[100024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:05 compute-2 python3.9[100026]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:51:05 compute-2 sudo[100024]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:05 compute-2 ceph-mon[74913]: pgmap v146: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:51:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:05 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:06 compute-2 sudo[100176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azupbvpystkssyikmzdzqqyvtwkxwldh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089865.6977148-274-63934230543088/AnsiballZ_ini_file.py'
Oct 10 09:51:06 compute-2 sudo[100176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:06 compute-2 python3.9[100178]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:51:06 compute-2 sudo[100176]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:06.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:06 compute-2 sudo[100329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-netdjptpmwqpqsfkrrblylaiahzsayrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089866.460309-274-166851761907179/AnsiballZ_ini_file.py'
Oct 10 09:51:06 compute-2 sudo[100329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:06.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:06 compute-2 python3.9[100331]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:51:06 compute-2 sudo[100329]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:07 compute-2 sudo[100482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-monzrwwyvxuttatwksvxyfeebnnewzfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089867.0406735-274-200119585981036/AnsiballZ_ini_file.py'
Oct 10 09:51:07 compute-2 sudo[100482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:07 compute-2 python3.9[100484]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:51:07 compute-2 sudo[100482]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:07 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:07 compute-2 ceph-mon[74913]: pgmap v147: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:51:07 compute-2 sudo[100634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxgjwxjydhhnsqdwofckcpzxyaibdeow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089867.5985227-274-13386233333430/AnsiballZ_ini_file.py'
Oct 10 09:51:07 compute-2 sudo[100634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144001f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:08 compute-2 python3.9[100636]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:51:08 compute-2 sudo[100634]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:08.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:08.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:09 compute-2 sudo[100788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmsrgycujlolvancfxlfiockjmkholxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089869.144189-367-215782147769592/AnsiballZ_dnf.py'
Oct 10 09:51:09 compute-2 sudo[100788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:09 compute-2 python3.9[100790]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:51:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:09 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:09 compute-2 ceph-mon[74913]: pgmap v148: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:51:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144001f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:10.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:10.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:10 compute-2 sudo[100794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:51:10 compute-2 sudo[100794]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:51:10 compute-2 sudo[100794]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:10 compute-2 sudo[100788]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095111 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 09:51:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:11 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:11 compute-2 ceph-mon[74913]: pgmap v149: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:51:11 compute-2 sudo[100968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqgnkxxgzadetfvvxeglhpsbtnubqbrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089871.584866-400-74286265390848/AnsiballZ_setup.py'
Oct 10 09:51:11 compute-2 sudo[100968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:12 compute-2 python3.9[100970]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:51:12 compute-2 sudo[100968]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:12.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:12.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:13 compute-2 sudo[101124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyylxocfceyyvshsmpayubtaywtwrduu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089872.7457736-424-263571265631477/AnsiballZ_stat.py'
Oct 10 09:51:13 compute-2 sudo[101124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:13 compute-2 python3.9[101126]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:51:13 compute-2 sudo[101124]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:13 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144001f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:13 compute-2 ceph-mon[74913]: pgmap v150: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:51:13 compute-2 sudo[101276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbjaaonhweurdbdnuwrpyuidncvxyapm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089873.562085-451-164235460291227/AnsiballZ_stat.py'
Oct 10 09:51:13 compute-2 sudo[101276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:14 compute-2 python3.9[101278]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:51:14 compute-2 sudo[101276]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:14.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:14.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:14 compute-2 sudo[101430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oepdmdnqxgphggvgrnwwdsdfqelckqnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089874.5303805-481-2951725843953/AnsiballZ_service_facts.py'
Oct 10 09:51:14 compute-2 sudo[101430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:15 compute-2 python3.9[101432]: ansible-service_facts Invoked
Oct 10 09:51:15 compute-2 network[101449]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 09:51:15 compute-2 network[101450]: 'network-scripts' will be removed from distribution in near future.
Oct 10 09:51:15 compute-2 network[101451]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 09:51:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:15 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:15 compute-2 ceph-mon[74913]: pgmap v151: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:51:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000066s ======
Oct 10 09:51:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:16.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000066s
Oct 10 09:51:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:51:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:16.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:17 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:17 compute-2 ceph-mon[74913]: pgmap v152: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:51:17 compute-2 sudo[101430]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:18.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:18.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:19 compute-2 ceph-mon[74913]: pgmap v153: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Oct 10 09:51:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:19 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:51:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:20 compute-2 sudo[101741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhyhxszpankxvsgmzcrjvjoxdxuexvxj ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1760089880.0327172-521-199372434381714/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1760089880.0327172-521-199372434381714/args'
Oct 10 09:51:20 compute-2 sudo[101741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:20 compute-2 sudo[101741]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:20.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:20.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:21 compute-2 sudo[101910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvbhxissfjjakunuscpethqdmirzulds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089880.9286358-554-191269478561188/AnsiballZ_dnf.py'
Oct 10 09:51:21 compute-2 sudo[101910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:21 compute-2 ceph-mon[74913]: pgmap v154: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Oct 10 09:51:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:21 compute-2 python3.9[101912]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:51:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:21 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:22.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:22 compute-2 sudo[101910]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:22.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:23 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:51:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:23 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:51:23 compute-2 ceph-mon[74913]: pgmap v155: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:51:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:23 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:23 compute-2 sudo[102065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ychrmyoxcsgticyyzeduqxiogbyqbifd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089883.2737503-593-58370088485825/AnsiballZ_package_facts.py'
Oct 10 09:51:23 compute-2 sudo[102065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:24 compute-2 python3.9[102067]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct 10 09:51:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:24.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:24 compute-2 sudo[102065]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:24.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:25 compute-2 ceph-mon[74913]: pgmap v156: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 596 B/s wr, 1 op/s
Oct 10 09:51:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:25 compute-2 sudo[102219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqkhipldcyrusstnrtxqfaiopilrhxdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089885.271397-623-73857802308369/AnsiballZ_stat.py'
Oct 10 09:51:25 compute-2 sudo[102219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:25 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:25 compute-2 python3.9[102221]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:51:25 compute-2 sudo[102219]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:26 compute-2 sudo[102297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxjvokfrkyhrcfhzboiemzfmxktfaqeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089885.271397-623-73857802308369/AnsiballZ_file.py'
Oct 10 09:51:26 compute-2 sudo[102297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:26 compute-2 python3.9[102299]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:26 compute-2 sudo[102297]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:51:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:26.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:26.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:27 compute-2 sudo[102451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bokiyyacsjxrrobjgrbvkhybeshgstmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089886.7293003-660-29943809713726/AnsiballZ_stat.py'
Oct 10 09:51:27 compute-2 sudo[102451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:27 compute-2 python3.9[102453]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:51:27 compute-2 sudo[102451]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:27 compute-2 ceph-mon[74913]: pgmap v157: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:51:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:27 compute-2 sudo[102529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwhgsmeumrmydvccchfervtzcaaahjqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089886.7293003-660-29943809713726/AnsiballZ_file.py'
Oct 10 09:51:27 compute-2 sudo[102529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:27 compute-2 python3.9[102531]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:27 compute-2 sudo[102529]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:27 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:28.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:28.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:29 compute-2 sudo[102683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbtlkshwxmwaqnqmqsofcqllivdpaokn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089888.7597136-714-216695051588343/AnsiballZ_lineinfile.py'
Oct 10 09:51:29 compute-2 sudo[102683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:29 compute-2 ceph-mon[74913]: pgmap v158: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:51:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:29 compute-2 python3.9[102685]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:29 compute-2 sudo[102683]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:29 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:30.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:30 compute-2 sudo[102837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlyftgwywrkykgmbjiguscpfvhpfmjfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089890.548202-759-21114785920432/AnsiballZ_setup.py'
Oct 10 09:51:30 compute-2 sudo[102837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:30.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:30 compute-2 sudo[102840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:51:30 compute-2 sudo[102840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:51:30 compute-2 sudo[102840]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:31 compute-2 python3.9[102839]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:51:31 compute-2 sudo[102837]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:31 compute-2 ceph-mon[74913]: pgmap v159: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 09:51:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:51:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095131 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 09:51:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:31 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:31 compute-2 sudo[102948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwlbhaajccknjhdsqhsojudxapwhiswc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089890.548202-759-21114785920432/AnsiballZ_systemd.py'
Oct 10 09:51:31 compute-2 sudo[102948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:32 compute-2 python3.9[102950]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:51:32 compute-2 sudo[102948]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:32.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:32.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:33 compute-2 sshd-session[98374]: Connection closed by 192.168.122.30 port 42478
Oct 10 09:51:33 compute-2 sshd-session[98371]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:51:33 compute-2 systemd[1]: session-42.scope: Deactivated successfully.
Oct 10 09:51:33 compute-2 systemd[1]: session-42.scope: Consumed 23.061s CPU time.
Oct 10 09:51:33 compute-2 systemd-logind[796]: Session 42 logged out. Waiting for processes to exit.
Oct 10 09:51:33 compute-2 systemd-logind[796]: Removed session 42.
Oct 10 09:51:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:33 compute-2 ceph-mon[74913]: pgmap v160: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 09:51:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:33 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:34.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:34.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:35 compute-2 ceph-mon[74913]: pgmap v161: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:51:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:35 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:36.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:36.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:37 compute-2 ceph-mon[74913]: pgmap v162: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:51:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:37 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:38.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:38.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:38 compute-2 sshd-session[102985]: Accepted publickey for zuul from 192.168.122.30 port 37130 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:51:38 compute-2 systemd-logind[796]: New session 43 of user zuul.
Oct 10 09:51:38 compute-2 systemd[1]: Started Session 43 of User zuul.
Oct 10 09:51:38 compute-2 sshd-session[102985]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:51:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:39 compute-2 ceph-mon[74913]: pgmap v163: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 426 B/s wr, 2 op/s
Oct 10 09:51:39 compute-2 sudo[103138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbfsxcdkpmshnvextkrpwmbzpsphqabe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089899.0912032-28-103084863621190/AnsiballZ_file.py'
Oct 10 09:51:39 compute-2 sudo[103138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:39 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:39 compute-2 python3.9[103140]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:39 compute-2 sudo[103138]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:40.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:40 compute-2 sudo[103291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbdrfdwsvhuyftemmzouyeqlimutqcrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089900.1362793-65-201477305413841/AnsiballZ_stat.py'
Oct 10 09:51:40 compute-2 sudo[103291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:40 compute-2 python3.9[103293]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:51:40 compute-2 sudo[103291]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:40.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095141 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 09:51:41 compute-2 sudo[103370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeffcnixsucskkditstprfiznlmtdsaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089900.1362793-65-201477305413841/AnsiballZ_file.py'
Oct 10 09:51:41 compute-2 sudo[103370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:41 compute-2 python3.9[103372]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:41 compute-2 sudo[103370]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:41 compute-2 ceph-mon[74913]: pgmap v164: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:51:41 compute-2 sshd-session[102988]: Connection closed by 192.168.122.30 port 37130
Oct 10 09:51:41 compute-2 sshd-session[102985]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:51:41 compute-2 systemd[1]: session-43.scope: Deactivated successfully.
Oct 10 09:51:41 compute-2 systemd[1]: session-43.scope: Consumed 1.628s CPU time.
Oct 10 09:51:41 compute-2 systemd-logind[796]: Session 43 logged out. Waiting for processes to exit.
Oct 10 09:51:41 compute-2 systemd-logind[796]: Removed session 43.
Oct 10 09:51:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:41 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:42.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:42.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:43 compute-2 ceph-mon[74913]: pgmap v165: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:51:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:43 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:44.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:44.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:45 compute-2 ceph-mon[74913]: pgmap v166: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:51:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:45 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:46.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:51:46 compute-2 sshd-session[103402]: Accepted publickey for zuul from 192.168.122.30 port 52590 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:51:46 compute-2 systemd-logind[796]: New session 44 of user zuul.
Oct 10 09:51:46 compute-2 systemd[1]: Started Session 44 of User zuul.
Oct 10 09:51:46 compute-2 sshd-session[103402]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:51:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:46.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:47 compute-2 ceph-mon[74913]: pgmap v167: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:51:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:47 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:47 compute-2 python3.9[103556]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:51:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:48.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:48 compute-2 sudo[103712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcburfeqxaqwpguoplwwtrlqykhnifda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089908.3927493-61-242839535133029/AnsiballZ_file.py'
Oct 10 09:51:48 compute-2 sudo[103712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:48.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:49 compute-2 python3.9[103714]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:49 compute-2 sudo[103712]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:49 compute-2 ceph-mon[74913]: pgmap v168: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Oct 10 09:51:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:49 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:49 compute-2 sudo[103887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbvkhhwbzzbrhycmwdvrbyyeginfcwht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089909.3709207-85-22061183667721/AnsiballZ_stat.py'
Oct 10 09:51:49 compute-2 sudo[103887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:50 compute-2 python3.9[103889]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:51:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:51:50 compute-2 sudo[103887]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:50 compute-2 sudo[103965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyhwvjogpfbnunhxnjkvdpgmftxrqlmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089909.3709207-85-22061183667721/AnsiballZ_file.py'
Oct 10 09:51:50 compute-2 sudo[103965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:50.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:50 compute-2 python3.9[103967]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.s8f14h_c recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:50 compute-2 sudo[103965]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:50.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:51 compute-2 sudo[103994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:51:51 compute-2 sudo[103994]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:51:51 compute-2 sudo[103994]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:51 compute-2 sudo[104144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veityxxgtgzmvedooaqzvxqpoedbwlwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089911.2197568-145-267346452745151/AnsiballZ_stat.py'
Oct 10 09:51:51 compute-2 sudo[104144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:51 compute-2 ceph-mon[74913]: pgmap v169: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Oct 10 09:51:51 compute-2 python3.9[104146]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:51:51 compute-2 sudo[104144]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:51 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:51 compute-2 sudo[104222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjbpbjrlobimhordipytfdphepklrpio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089911.2197568-145-267346452745151/AnsiballZ_file.py'
Oct 10 09:51:51 compute-2 sudo[104222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:52 compute-2 python3.9[104224]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.plz3066q recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:52 compute-2 sudo[104222]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:52.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:52.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:52 compute-2 sudo[104376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azigjmgqefpwnonzeletdniefffyyhqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089912.6468945-184-156799583707232/AnsiballZ_file.py'
Oct 10 09:51:52 compute-2 sudo[104376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:53 compute-2 python3.9[104378]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:51:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:53 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:51:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:53 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:51:53 compute-2 sudo[104376]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:53 compute-2 ceph-mon[74913]: pgmap v170: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:51:53 compute-2 sudo[104528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hozaoierpmxirulqayjirnsjeqbbzrlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089913.423041-208-6511007866671/AnsiballZ_stat.py'
Oct 10 09:51:53 compute-2 sudo[104528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:53 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:53 compute-2 python3.9[104530]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:51:54 compute-2 sudo[104528]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:54 compute-2 sudo[104606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thkrxuaremozxnzpbubxtphcvliwidea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089913.423041-208-6511007866671/AnsiballZ_file.py'
Oct 10 09:51:54 compute-2 sudo[104606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:54 compute-2 python3.9[104608]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:51:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:54 compute-2 sudo[104606]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:54.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:54 compute-2 sudo[104760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiufqidhzpsciaadgndydqavelglgbcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089914.5730271-208-161844886017502/AnsiballZ_stat.py'
Oct 10 09:51:54 compute-2 sudo[104760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:51:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:54.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:51:55 compute-2 python3.9[104762]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:51:55 compute-2 sudo[104760]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:55 compute-2 sudo[104838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gltemiyfyzufgnplnxvzflqefhosffjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089914.5730271-208-161844886017502/AnsiballZ_file.py'
Oct 10 09:51:55 compute-2 sudo[104838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:55 compute-2 python3.9[104840]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:51:55 compute-2 sudo[104838]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:55 compute-2 ceph-mon[74913]: pgmap v171: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:51:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:55 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:51:56 compute-2 sudo[104990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucbmwzkdtowspulteaachghegjhdzomw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089916.0193963-278-189083102893133/AnsiballZ_file.py'
Oct 10 09:51:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:56 compute-2 sudo[104990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:56.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:56 compute-2 python3.9[104992]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:56 compute-2 sudo[104990]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:56.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:57 compute-2 sudo[105144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jziazembnwobkwpnffredetdcvkbdoma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089916.8383045-302-181482282231970/AnsiballZ_stat.py'
Oct 10 09:51:57 compute-2 sudo[105144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:57 compute-2 python3.9[105146]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:51:57 compute-2 sudo[105144]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:57 compute-2 sudo[105222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyjsiymaocphdixoldzwppzifycalcme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089916.8383045-302-181482282231970/AnsiballZ_file.py'
Oct 10 09:51:57 compute-2 sudo[105222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:57 compute-2 python3.9[105224]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:57 compute-2 ceph-mon[74913]: pgmap v172: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:51:57 compute-2 sudo[105222]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:57 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:58 compute-2 sudo[105374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elmgbtklcmusjabvttrlwdghktdcjegg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089918.1334338-338-127140869390650/AnsiballZ_stat.py'
Oct 10 09:51:58 compute-2 sudo[105374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:51:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:58.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:51:58 compute-2 python3.9[105376]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:51:58 compute-2 sudo[105374]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:58 compute-2 sudo[105454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkvvwhwdrdgtotvkaccpasenkkvmmzfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089918.1334338-338-127140869390650/AnsiballZ_file.py'
Oct 10 09:51:58 compute-2 sudo[105454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:51:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:51:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:51:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:58.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:51:59 compute-2 python3.9[105456]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:51:59 compute-2 sudo[105454]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:51:59 compute-2 systemd[82041]: Created slice User Background Tasks Slice.
Oct 10 09:51:59 compute-2 systemd[82041]: Starting Cleanup of User's Temporary Files and Directories...
Oct 10 09:51:59 compute-2 ceph-mon[74913]: pgmap v173: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:51:59 compute-2 systemd[82041]: Finished Cleanup of User's Temporary Files and Directories.
Oct 10 09:51:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:59 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:51:59 compute-2 sudo[105535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:51:59 compute-2 sudo[105535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:51:59 compute-2 sudo[105535]: pam_unix(sudo:session): session closed for user root
Oct 10 09:51:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:51:59 compute-2 sudo[105563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:51:59 compute-2 sudo[105563]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:52:00 compute-2 sudo[105658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wulldftusuewurntdroofkliqydqlfsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089919.4670527-373-155109192438505/AnsiballZ_systemd.py'
Oct 10 09:52:00 compute-2 sudo[105658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:00 compute-2 python3.9[105660]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:52:00 compute-2 systemd[1]: Reloading.
Oct 10 09:52:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:00 compute-2 sudo[105563]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:00.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:00 compute-2 systemd-rc-local-generator[105718]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:52:00 compute-2 systemd-sysv-generator[105721]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:52:00 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:52:00 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:52:00 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:52:00 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:52:00 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:52:00 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:52:00 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:52:00 compute-2 sudo[105658]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:00.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:01 compute-2 sudo[105880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otxzmiokgcnynwagxaipflwkeswvqrcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089920.9613972-398-26258865435707/AnsiballZ_stat.py'
Oct 10 09:52:01 compute-2 sudo[105880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:01 compute-2 python3.9[105882]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:01 compute-2 sudo[105880]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:01 compute-2 sudo[105958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-welugzkejsowlpwhwmexqnsuvrcuouev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089920.9613972-398-26258865435707/AnsiballZ_file.py'
Oct 10 09:52:01 compute-2 sudo[105958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:01 compute-2 ceph-mon[74913]: pgmap v174: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 09:52:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:52:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:01 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:01 compute-2 python3.9[105960]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:01 compute-2 sudo[105958]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001ba0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:52:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:02.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:52:02 compute-2 sudo[106112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtyxvfvyssmlvlfgdxpoenizlbaathgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089922.3116884-434-20281604741729/AnsiballZ_stat.py'
Oct 10 09:52:02 compute-2 sudo[106112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:02 compute-2 python3.9[106114]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:02 compute-2 sudo[106112]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:52:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:02.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:52:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095203 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 09:52:03 compute-2 sudo[106191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfvyazejdnogcrtpkrrynpucbntiwwdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089922.3116884-434-20281604741729/AnsiballZ_file.py'
Oct 10 09:52:03 compute-2 sudo[106191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:03 compute-2 python3.9[106193]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:03 compute-2 sudo[106191]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:03 compute-2 ceph-mon[74913]: pgmap v175: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 09:52:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:03 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:03 compute-2 sudo[106344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diutdrykojbehkldhlzrqsjxxynsycae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089923.5044637-469-30102274701358/AnsiballZ_systemd.py'
Oct 10 09:52:03 compute-2 sudo[106344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:04 compute-2 python3.9[106346]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:52:04 compute-2 systemd[1]: Reloading.
Oct 10 09:52:04 compute-2 systemd-rc-local-generator[106373]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:52:04 compute-2 systemd-sysv-generator[106376]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:52:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:04 compute-2 systemd[1]: Starting Create netns directory...
Oct 10 09:52:04 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 09:52:04 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 09:52:04 compute-2 systemd[1]: Finished Create netns directory.
Oct 10 09:52:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:04.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:04 compute-2 sudo[106344]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:52:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:04.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:05 compute-2 python3.9[106539]: ansible-ansible.builtin.service_facts Invoked
Oct 10 09:52:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:05 compute-2 network[106556]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 09:52:05 compute-2 network[106557]: 'network-scripts' will be removed from distribution in near future.
Oct 10 09:52:05 compute-2 network[106558]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 09:52:05 compute-2 ceph-mon[74913]: pgmap v176: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:52:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:05 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001ba0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:52:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:06.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:52:06 compute-2 sudo[106597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:52:06 compute-2 sudo[106597]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:52:06 compute-2 sudo[106597]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:06.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:07 compute-2 ceph-mon[74913]: pgmap v177: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:52:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:52:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:52:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:07 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001ba0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:08.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:08.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:09 compute-2 ceph-mon[74913]: pgmap v178: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 426 B/s wr, 2 op/s
Oct 10 09:52:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:09 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:52:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1640022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:10 compute-2 sudo[106850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prfehkgtoggiuzspbjvbttqkbvmzgfci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089930.06908-548-213290768831489/AnsiballZ_stat.py'
Oct 10 09:52:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:10 compute-2 sudo[106850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:10.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:10 compute-2 python3.9[106852]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:10 compute-2 sudo[106850]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:10 compute-2 sudo[106930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxbrxbarvsysugkvospblfgurunqplwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089930.06908-548-213290768831489/AnsiballZ_file.py'
Oct 10 09:52:10 compute-2 sudo[106930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:52:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:10.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:52:11 compute-2 python3.9[106932]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:11 compute-2 sudo[106930]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:11 compute-2 sudo[106941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:52:11 compute-2 sudo[106941]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:52:11 compute-2 sudo[106941]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:11 compute-2 ceph-mon[74913]: pgmap v179: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:52:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:11 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:11 compute-2 sudo[107107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drtgifuoxlofhialthmbxilrqnygxwez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089931.4883885-587-220394229048683/AnsiballZ_file.py'
Oct 10 09:52:11 compute-2 sudo[107107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:12 compute-2 python3.9[107109]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:12 compute-2 sudo[107107]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1640022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:12.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:12 compute-2 sudo[107260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glgscegxzhnwwqehoayoeepaiinkglup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089932.3197799-611-41532926617665/AnsiballZ_stat.py'
Oct 10 09:52:12 compute-2 sudo[107260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:12 compute-2 python3.9[107262]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:12 compute-2 sudo[107260]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:12.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:13 compute-2 sudo[107339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrqanfnkkotwndkmwayptopezntjbbcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089932.3197799-611-41532926617665/AnsiballZ_file.py'
Oct 10 09:52:13 compute-2 sudo[107339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:13 compute-2 python3.9[107341]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:13 compute-2 sudo[107339]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:13 compute-2 ceph-mon[74913]: pgmap v180: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:52:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:13 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:14 compute-2 sudo[107491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvhspplfmkauwnsewnjgftvibvxiweol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089933.7993834-656-220798213698297/AnsiballZ_timezone.py'
Oct 10 09:52:14 compute-2 sudo[107491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:14 compute-2 python3.9[107493]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 10 09:52:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:14 compute-2 systemd[1]: Starting Time & Date Service...
Oct 10 09:52:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:14.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:14 compute-2 systemd[1]: Started Time & Date Service.
Oct 10 09:52:14 compute-2 sudo[107491]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:52:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:14.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:15 compute-2 sudo[107649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdaaoregdkcuqdsejaxnhjngocckioeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089935.0944552-683-280622175931188/AnsiballZ_file.py'
Oct 10 09:52:15 compute-2 sudo[107649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:15 compute-2 ceph-mon[74913]: pgmap v181: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:15 compute-2 python3.9[107651]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:15 compute-2 sudo[107649]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:15 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1640022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:16 compute-2 sudo[107801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shlwrvmmyhzytssegnolonvpkcwlgrwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089935.9483993-707-141920618132850/AnsiballZ_stat.py'
Oct 10 09:52:16 compute-2 sudo[107801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:16 compute-2 python3.9[107803]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:16 compute-2 sudo[107801]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:16.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:52:16 compute-2 sudo[107880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuaiuuoeienmkfgbscibjbrsmxhkjnez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089935.9483993-707-141920618132850/AnsiballZ_file.py'
Oct 10 09:52:16 compute-2 sudo[107880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:16 compute-2 python3.9[107882]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:16 compute-2 sudo[107880]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:16.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:17 compute-2 sudo[108033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwywdcdmwhznrsfoxlwknzpcimhylyqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089937.2661886-744-194343350792465/AnsiballZ_stat.py'
Oct 10 09:52:17 compute-2 sudo[108033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:17 compute-2 ceph-mon[74913]: pgmap v182: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:17 compute-2 python3.9[108035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:17 compute-2 sudo[108033]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:17 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:17 compute-2 sudo[108111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqgxnfxgqndesijvxpdfjsszphyadrza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089937.2661886-744-194343350792465/AnsiballZ_file.py'
Oct 10 09:52:17 compute-2 sudo[108111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:18 compute-2 python3.9[108113]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.e7dgzvna recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:18 compute-2 sudo[108111]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:18.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:18 compute-2 sudo[108265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzewormoetzzcxbozqtisdyflfcvwcfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089938.594986-779-208325971723866/AnsiballZ_stat.py'
Oct 10 09:52:18 compute-2 sudo[108265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:18.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:19 compute-2 python3.9[108267]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:19 compute-2 sudo[108265]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:19 compute-2 sudo[108343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bomnlhxlclencnjbpljpgjsklnycwvbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089938.594986-779-208325971723866/AnsiballZ_file.py'
Oct 10 09:52:19 compute-2 sudo[108343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:19 compute-2 python3.9[108345]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:19 compute-2 sudo[108343]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:19 compute-2 ceph-mon[74913]: pgmap v183: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:52:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:19 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:52:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:20 compute-2 sudo[108495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxjxmhhjlumucjptpuwnklrnhmkrimqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089940.0165653-818-34722119403327/AnsiballZ_command.py'
Oct 10 09:52:20 compute-2 sudo[108495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:20.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:20 compute-2 python3.9[108497]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:52:20 compute-2 sudo[108495]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:52:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:20.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:52:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:21 compute-2 sudo[108650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkdnsrjnmixwcldoewbzsadhxkjcjnkv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760089940.9979722-842-27455484603244/AnsiballZ_edpm_nftables_from_files.py'
Oct 10 09:52:21 compute-2 sudo[108650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:21 compute-2 python3[108652]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 10 09:52:21 compute-2 sudo[108650]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:21 compute-2 ceph-mon[74913]: pgmap v184: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:21 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:22 compute-2 sudo[108802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdcsqlqmelauamouwkvkiwivcsvdyaid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089941.9639678-866-258941777636140/AnsiballZ_stat.py'
Oct 10 09:52:22 compute-2 sudo[108802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:22 compute-2 python3.9[108804]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:22 compute-2 sudo[108802]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:52:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:22.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:52:22 compute-2 sudo[108881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dikaksegsmbealoivfthylahbhibfbsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089941.9639678-866-258941777636140/AnsiballZ_file.py'
Oct 10 09:52:22 compute-2 sudo[108881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:22 compute-2 python3.9[108883]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:22 compute-2 sudo[108881]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:52:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:22.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:52:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:23 compute-2 sudo[109034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arztsfjnddosnkpqvplpvodnansqfhzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089943.324099-902-156538951346003/AnsiballZ_stat.py'
Oct 10 09:52:23 compute-2 sudo[109034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:23 compute-2 ceph-mon[74913]: pgmap v185: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:52:23 compute-2 python3.9[109036]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:23 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:23 compute-2 sudo[109034]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:24 compute-2 sudo[109112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjilnynxuzsvyslspnqredtxtzxrrznf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089943.324099-902-156538951346003/AnsiballZ_file.py'
Oct 10 09:52:24 compute-2 sudo[109112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:24 compute-2 python3.9[109114]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:24 compute-2 sudo[109112]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:24.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:52:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:24.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:24 compute-2 sudo[109267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgapawhxxezgbcokwrcgfsblhfyshuug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089944.7213721-938-105563919780156/AnsiballZ_stat.py'
Oct 10 09:52:24 compute-2 sudo[109267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:25 compute-2 python3.9[109269]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:25 compute-2 sudo[109267]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:25 compute-2 sudo[109345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxfpvscmmdlfrizriimowymminnogqfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089944.7213721-938-105563919780156/AnsiballZ_file.py'
Oct 10 09:52:25 compute-2 sudo[109345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:25 compute-2 python3.9[109347]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:25 compute-2 sudo[109345]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:25 compute-2 ceph-mon[74913]: pgmap v186: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:25 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:26 compute-2 sudo[109497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsrvzshahwsgejucsahzmgubzogrotas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089946.1014996-973-181231428451328/AnsiballZ_stat.py'
Oct 10 09:52:26 compute-2 sudo[109497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:26.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:26 compute-2 python3.9[109499]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:26 compute-2 sudo[109497]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:26 compute-2 sudo[109577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raorhbiyzlijxgeaybuphplsuosverst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089946.1014996-973-181231428451328/AnsiballZ_file.py'
Oct 10 09:52:26 compute-2 sudo[109577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:52:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:26.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:52:26 compute-2 python3.9[109579]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:27 compute-2 sudo[109577]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:27 compute-2 ceph-mon[74913]: pgmap v187: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:27 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:27 compute-2 sudo[109729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmesmkldxnlifykskvipvnvetpvcaljn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089947.4734635-1010-192147343004145/AnsiballZ_stat.py'
Oct 10 09:52:27 compute-2 sudo[109729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:28 compute-2 python3.9[109731]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:28 compute-2 sudo[109729]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:28 compute-2 sudo[109807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnyrmiocjsantmtsymxglwwxmkifdmxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089947.4734635-1010-192147343004145/AnsiballZ_file.py'
Oct 10 09:52:28 compute-2 sudo[109807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:28 compute-2 python3.9[109809]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:28 compute-2 sudo[109807]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:28.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:28.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:29 compute-2 sudo[109961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mikrpqauzxixpuiscfdkrichextkdhtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089949.0612667-1049-69647457705996/AnsiballZ_command.py'
Oct 10 09:52:29 compute-2 sudo[109961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:29 compute-2 python3.9[109963]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:52:29 compute-2 sudo[109961]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:29 compute-2 ceph-mon[74913]: pgmap v188: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:52:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:29 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:52:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:30 compute-2 sudo[110116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwgzcvjbkywlgrskfskfjocvsuabyzuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089949.9292982-1073-135124720733047/AnsiballZ_blockinfile.py'
Oct 10 09:52:30 compute-2 sudo[110116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:30.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:30 compute-2 python3.9[110118]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:30 compute-2 sudo[110116]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:52:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:30.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:52:31 compute-2 sudo[110270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smmnzhpogvixsvjqkoignvvvppoyoehi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089950.904126-1099-223512828896284/AnsiballZ_file.py'
Oct 10 09:52:31 compute-2 sudo[110270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:31 compute-2 sudo[110273]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:52:31 compute-2 sudo[110273]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:52:31 compute-2 sudo[110273]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:31 compute-2 python3.9[110272]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:31 compute-2 sudo[110270]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:31 compute-2 sudo[110447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reocfaggtnlhxxquqckuwpkitsqusrde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089951.5178673-1099-187537641134742/AnsiballZ_file.py'
Oct 10 09:52:31 compute-2 sudo[110447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:31 compute-2 ceph-mon[74913]: pgmap v189: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:52:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:31 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:31 compute-2 python3.9[110449]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:32 compute-2 sudo[110447]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:32.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:32 compute-2 sudo[110601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzsmqdtzartoclxjzhxqosyktngqtwmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089952.449643-1145-3189797231873/AnsiballZ_mount.py'
Oct 10 09:52:32 compute-2 sudo[110601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:32.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:33 compute-2 python3.9[110603]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 10 09:52:33 compute-2 sudo[110601]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:33 compute-2 sudo[110753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pugwvdfaunqqaedzetqcqaesmiltpuxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089953.2214491-1145-111101129200734/AnsiballZ_mount.py'
Oct 10 09:52:33 compute-2 sudo[110753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:33 compute-2 python3.9[110755]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 10 09:52:33 compute-2 sudo[110753]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:33 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1680011e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:33 compute-2 ceph-mon[74913]: pgmap v190: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:52:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:34 compute-2 sshd-session[103406]: Connection closed by 192.168.122.30 port 52590
Oct 10 09:52:34 compute-2 sshd-session[103402]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:52:34 compute-2 systemd[1]: session-44.scope: Deactivated successfully.
Oct 10 09:52:34 compute-2 systemd[1]: session-44.scope: Consumed 29.547s CPU time.
Oct 10 09:52:34 compute-2 systemd-logind[796]: Session 44 logged out. Waiting for processes to exit.
Oct 10 09:52:34 compute-2 systemd-logind[796]: Removed session 44.
Oct 10 09:52:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:34.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:52:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:52:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:34.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:52:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:35 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:35 compute-2 ceph-mon[74913]: pgmap v191: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1680011e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:36.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:36.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:37 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:37 compute-2 ceph-mon[74913]: pgmap v192: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168001380 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:38.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:38.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:39 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:39 compute-2 ceph-mon[74913]: pgmap v193: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:52:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:52:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:40 compute-2 sshd-session[110786]: Accepted publickey for zuul from 192.168.122.30 port 56556 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:52:40 compute-2 systemd-logind[796]: New session 45 of user zuul.
Oct 10 09:52:40 compute-2 systemd[1]: Started Session 45 of User zuul.
Oct 10 09:52:40 compute-2 sshd-session[110786]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:52:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:52:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:40.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:52:40 compute-2 sudo[110941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsshsldbggbzmqdlzfirvyzldyftlfut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089960.3206642-20-205414823499257/AnsiballZ_tempfile.py'
Oct 10 09:52:40 compute-2 sudo[110941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:40.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:41 compute-2 python3.9[110943]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct 10 09:52:41 compute-2 sudo[110941]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:41 compute-2 sudo[111093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpcglkjxiyvshxsqokripzquswflengu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089961.287992-57-266580254705594/AnsiballZ_stat.py'
Oct 10 09:52:41 compute-2 sudo[111093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:41 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1680095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:41 compute-2 python3.9[111095]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:52:41 compute-2 ceph-mon[74913]: pgmap v194: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:41 compute-2 sudo[111093]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:42.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:42 compute-2 sudo[111248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayhtjilzxrpfablbvrrajiwvigjiwdln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089962.234539-80-272652759756315/AnsiballZ_slurp.py'
Oct 10 09:52:42 compute-2 sudo[111248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:42 compute-2 python3.9[111250]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Oct 10 09:52:42 compute-2 sudo[111248]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:42 compute-2 ceph-mon[74913]: pgmap v195: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:52:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:52:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:42.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:52:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:43 compute-2 sudo[111401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtfwscfawxtwbewdkxlmfigzhxobllpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089963.1590438-105-234758570087073/AnsiballZ_stat.py'
Oct 10 09:52:43 compute-2 sudo[111401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:43 compute-2 python3.9[111403]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.6f632o26 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:52:43 compute-2 sudo[111401]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:43 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168009720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:44 compute-2 sudo[111526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbjwyzxosqiaduwdetsgwupgqaipewqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089963.1590438-105-234758570087073/AnsiballZ_copy.py'
Oct 10 09:52:44 compute-2 sudo[111526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:44 compute-2 python3.9[111528]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.6f632o26 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089963.1590438-105-234758570087073/.source.6f632o26 _original_basename=.uep4gh6k follow=False checksum=2d908d3ce99ab235b2c2751c9a38992c3c685672 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:44 compute-2 sudo[111526]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:44.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:44 compute-2 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 10 09:52:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:52:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:44.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:45 compute-2 ceph-mon[74913]: pgmap v196: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:45 compute-2 sudo[111682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iflxjhgnfvcseferjtenppnjnfsprind ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089964.8397036-149-117826550486600/AnsiballZ_setup.py'
Oct 10 09:52:45 compute-2 sudo[111682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:45 compute-2 python3.9[111684]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:52:45 compute-2 sudo[111682]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:45 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16800a040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:52:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:46 compute-2 sudo[111835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfojkxelswfiltbqjbgpdpvrwppywzty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089966.051674-174-247435945457000/AnsiballZ_blockinfile.py'
Oct 10 09:52:46 compute-2 sudo[111835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:46.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:46 compute-2 python3.9[111837]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCs576V3VvbSgv48Ml4JM3ripPY5VUVh8vdkDr1njjfd7J/WrQQkTf/D0b7+eGTXj3Y1fx1/haVrDafo7g0NqcSZX+zNUgTCnYPWafo7RMG4Q7ITVk1NPIkAC1cDUxHNeWhXaOkxCz96sTkO4aNW3uoFjsp2JkJtRJmHzT7q/bc0N9x7YcWh9vwRRBiOKlV8cWMHuHUzOlloEQLN67Dht1xHWr1eO/SITqUlWY13tc/54xQuo8nBQNNX9ArhMbJz2a9AoNVUAAYFF8hWFI5ES/GL9qsCp8dnmAtrY4Rc07QmHo1RkcjXe1f6D+vymRIP3YOqIjlWp0blCTfcCGno5lBa9f5JachIsogk+5+GYx4AAbWLyxxecfKzdCxrGnQlfFgldc1xDN1RG+8HwFEAuHQDWTCDUgF67FXSHy7aVxrdzU4046193/o3VKTpSaJmFldASxFgyUeujs56OgC0qYM0zKV4jOsMBcocVHvH/1FOPWIr81XXYvu6C/Ntd6sBj0=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGSf7pFS/S1SmUMk/yMobwR+LTaQZlAhBqo7Ido5r8dg
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBB1l0EOuMseZ7ulHkfzzVtKv+5A9EWRy+oXVB+t370vohhJoN3+lviS8xoR8GttJUcHVCaeioniRtOWysbNdC0I=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUnwO+j5aInA4FKMx5pWF8B0Zp6L17GsYV5RBbu6iT67LtXjwbz5nP4EC7t80boMHnS7DRNCAxF0FNMVhQ9o4+1E1n2mrUxxAw8YxcZTabu/lAqRb4I6RzmXdXSA9mF8O3onswi/KhJg6YUTFEWCuxWrMLco15IatKi+hNqcRUk1DreR2L/YN0W5qXkvj1z3aoph1h3Yn1lRjuQDrVHp6lCywixC2pHwYG+CrPyX+0PkXJg+JRvRdxNCIw0D0zOkJrnppmT8XpIj42JLRUGGV592XFVXHiEhZdOI2bdzPy490EfIbWF9Symqi/V5vf8SK9LMOscHXkD7jsT6VKzsUXyk6/IzzZ2TzhD173lt8HpRJyaZq4ME0ZSVYNyD58DN/CQ3xpO1c1E8Wp4fUswc4WHmb/eILnY0lDXOZt6Hb/e+K6RHu5e5GOo0KSfei/LyrqJkBQn2P8UkbJvrUh2bNw+whjvT5CmXd3rPCw+Xq3/K3Gpit1K/4pC0zGC+CQr7E=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILklS4uW4IrGY5dWZTg4VeKVeFB3jPeUpu/8f4D1+rd5
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCelD2lLiMWT09YjxTI9IfdSnHfdMuHKAAEYFKZmJg34mgwUIDqUQqoc9I6a7Ps9pRizY+UpHWL//lD7hvvhD5k=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDarlOcgDXqRdSww3oIuqu7nGIBJToNGSnU1ljOr6GTlHTxxOoTztIrvZrPaJA8w/ixztkhFZZSdRPw4meYayY05CNu9SneiL62twzDLDsqeDPAspkh69Ljj5aGCLf6GJDiK0m2h1jLDIFtXH3lIQE9781zA7ZQ8+/xeF4yRS1/Fb5CXDG+oi/J0veCffs6t0TYmrUfSgS2H2y0UxNu7C6GoQKRde1arPLOYexvlg2RjlWM6Ex4JCqTAd9EN330Kh4HUr3r46ET8mwi1mPndibbiW0heXgrg8FeV5hBqOxQsGgLEKpX1cNAz6Rr0C5Hg1xfGcsJtep88vbJFmMyV1jNowDtJCYpprqa16Nj35HBuuz7zbzVlIdeQhEJ9I4I7eNhUxlb2/XYRXy2hfsrM9D2TP7B+bVPLjlqgqy8stBhGBCtH32ppNsXHE6uGPHMovcz2VhbP/P3sp9NQV+hF2Q0RbBXrQZkEI9YJdhxQw5hyOqwfPrEEBFy8FpzSKfBAW0=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC1nQuW/lbxVJxo9H20J7i0+Z6cHtufrF4VbA6zs724f
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBB0oTxSrAqx34tAubl7rouYPI7qhs6NhoDmGr3PTW1+mypEQw0EO+pZ99zSRnweC5RBoL080AgUKo7KN+v3LDHw=
                                              create=True mode=0644 path=/tmp/ansible.6f632o26 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:46 compute-2 sudo[111835]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:47.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:47 compute-2 ceph-mon[74913]: pgmap v197: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:47 compute-2 sudo[111988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glvlwgxzwsskyshxtbfxzvwxvslpizpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089966.9876318-198-146937601770675/AnsiballZ_command.py'
Oct 10 09:52:47 compute-2 sudo[111988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:47 compute-2 python3.9[111990]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.6f632o26' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:52:47 compute-2 sudo[111988]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:47 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:48 compute-2 sudo[112142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqprbtcfbchpqkbqqcnmiirqmjelwrki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089967.9368181-222-46023261810826/AnsiballZ_file.py'
Oct 10 09:52:48 compute-2 sudo[112142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:48 compute-2 python3.9[112144]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.6f632o26 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:52:48 compute-2 sudo[112142]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:48.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:48 compute-2 sshd-session[110789]: Connection closed by 192.168.122.30 port 56556
Oct 10 09:52:49 compute-2 sshd-session[110786]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:52:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:49.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:49 compute-2 systemd[1]: session-45.scope: Deactivated successfully.
Oct 10 09:52:49 compute-2 systemd[1]: session-45.scope: Consumed 5.251s CPU time.
Oct 10 09:52:49 compute-2 systemd-logind[796]: Session 45 logged out. Waiting for processes to exit.
Oct 10 09:52:49 compute-2 systemd-logind[796]: Removed session 45.
Oct 10 09:52:49 compute-2 ceph-mon[74913]: pgmap v198: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:52:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:49 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16800a040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:52:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:50.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:52:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:51.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:52:51 compute-2 sudo[112174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:52:51 compute-2 sudo[112174]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:52:51 compute-2 sudo[112174]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:51 compute-2 ceph-mon[74913]: pgmap v199: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:51 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16800a040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:52.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:53.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:53 compute-2 ceph-mon[74913]: pgmap v200: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:52:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:53 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:54 compute-2 sshd-session[112201]: Accepted publickey for zuul from 192.168.122.30 port 41174 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:52:54 compute-2 systemd-logind[796]: New session 46 of user zuul.
Oct 10 09:52:54 compute-2 systemd[1]: Started Session 46 of User zuul.
Oct 10 09:52:54 compute-2 sshd-session[112201]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:52:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16800a040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:54.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:52:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:55.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:55 compute-2 python3.9[112356]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:52:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:55 compute-2 ceph-mon[74913]: pgmap v201: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:55 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:56 compute-2 sudo[112511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgojyzpjcrzwsvyvmxgptzkyebyjjrlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089975.5670938-58-34510107450762/AnsiballZ_systemd.py'
Oct 10 09:52:56 compute-2 sudo[112511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164003150 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:56 compute-2 python3.9[112513]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 10 09:52:56 compute-2 sudo[112511]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:56.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:57.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:57 compute-2 sudo[112667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inzzcctdtppzgfcofpocxspgsgtawrqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089976.848435-82-222872015780810/AnsiballZ_systemd.py'
Oct 10 09:52:57 compute-2 sudo[112667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:57 compute-2 python3.9[112669]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 09:52:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:57 compute-2 sudo[112667]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:57 compute-2 ceph-mon[74913]: pgmap v202: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:52:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:57 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140004010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:58 compute-2 sudo[112820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piuvnuxhivfgrvvubxsvwhoxvjknwsqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089977.8280542-110-259708362168702/AnsiballZ_command.py'
Oct 10 09:52:58 compute-2 sudo[112820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:58 compute-2 python3.9[112822]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:52:58 compute-2 sudo[112820]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.567895) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978567935, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2160, "num_deletes": 251, "total_data_size": 6238763, "memory_usage": 6311288, "flush_reason": "Manual Compaction"}
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978584533, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 2509325, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10754, "largest_seqno": 12909, "table_properties": {"data_size": 2503062, "index_size": 3206, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15654, "raw_average_key_size": 20, "raw_value_size": 2489405, "raw_average_value_size": 3195, "num_data_blocks": 143, "num_entries": 779, "num_filter_entries": 779, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089776, "oldest_key_time": 1760089776, "file_creation_time": 1760089978, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 16675 microseconds, and 7914 cpu microseconds.
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.584573) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 2509325 bytes OK
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.584591) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.586159) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.586173) EVENT_LOG_v1 {"time_micros": 1760089978586169, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.586190) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 6229284, prev total WAL file size 6229284, number of live WAL files 2.
Oct 10 09:52:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:52:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:58.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.588334) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323532' seq:0, type:0; will stop at (end)
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2450KB)], [21(12MB)]
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978588373, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 15935161, "oldest_snapshot_seqno": -1}
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4433 keys, 14286695 bytes, temperature: kUnknown
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978690025, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 14286695, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14252792, "index_size": 21697, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11141, "raw_key_size": 111751, "raw_average_key_size": 25, "raw_value_size": 14167774, "raw_average_value_size": 3195, "num_data_blocks": 932, "num_entries": 4433, "num_filter_entries": 4433, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760089978, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.690227) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 14286695 bytes
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.691683) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 156.7 rd, 140.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 12.8 +0.0 blob) out(13.6 +0.0 blob), read-write-amplify(12.0) write-amplify(5.7) OK, records in: 4856, records dropped: 423 output_compression: NoCompression
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.691701) EVENT_LOG_v1 {"time_micros": 1760089978691692, "job": 10, "event": "compaction_finished", "compaction_time_micros": 101710, "compaction_time_cpu_micros": 52067, "output_level": 6, "num_output_files": 1, "total_output_size": 14286695, "num_input_records": 4856, "num_output_records": 4433, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978692158, "job": 10, "event": "table_file_deletion", "file_number": 23}
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978694190, "job": 10, "event": "table_file_deletion", "file_number": 21}
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.588251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.694298) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.694308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.694312) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.694316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:52:58 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.694320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:52:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:52:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:52:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:59.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:52:59 compute-2 sudo[112975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgpcyxeoeeabqajkmwxaofqccsoikbky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089978.8243883-133-114128036115143/AnsiballZ_stat.py'
Oct 10 09:52:59 compute-2 sudo[112975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:52:59 compute-2 python3.9[112977]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:52:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:59 compute-2 sudo[112975]: pam_unix(sudo:session): session closed for user root
Oct 10 09:52:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:52:59 compute-2 ceph-mon[74913]: pgmap v203: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:52:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:59 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:52:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:00 compute-2 sudo[113127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjsuiwrkdrjsdkbbuqsuerhrjokbajuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089979.7847085-160-30847082941637/AnsiballZ_file.py'
Oct 10 09:53:00 compute-2 sudo[113127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140004030 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:00 compute-2 python3.9[113129]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:00 compute-2 sudo[113127]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:00.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:00 compute-2 sshd-session[112204]: Connection closed by 192.168.122.30 port 41174
Oct 10 09:53:00 compute-2 sshd-session[112201]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:53:00 compute-2 systemd[1]: session-46.scope: Deactivated successfully.
Oct 10 09:53:00 compute-2 systemd[1]: session-46.scope: Consumed 3.922s CPU time.
Oct 10 09:53:00 compute-2 systemd-logind[796]: Session 46 logged out. Waiting for processes to exit.
Oct 10 09:53:00 compute-2 systemd-logind[796]: Removed session 46.
Oct 10 09:53:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:53:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:01.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:53:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:01 compute-2 ceph-mon[74913]: pgmap v204: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:53:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:53:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:01 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144002470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:02.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:03.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:03 compute-2 ceph-mon[74913]: pgmap v205: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:53:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:03 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:53:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:04.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:53:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:05.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:05 compute-2 ceph-mon[74913]: pgmap v206: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:53:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:05 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144002470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:06.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:06 compute-2 sshd-session[113162]: Accepted publickey for zuul from 192.168.122.30 port 45546 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:53:06 compute-2 systemd-logind[796]: New session 47 of user zuul.
Oct 10 09:53:06 compute-2 systemd[1]: Started Session 47 of User zuul.
Oct 10 09:53:06 compute-2 sshd-session[113162]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:53:06 compute-2 sudo[113194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:53:06 compute-2 sudo[113194]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:53:06 compute-2 sudo[113194]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:06 compute-2 sudo[113244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:53:06 compute-2 sudo[113244]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:53:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:07.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095307 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 09:53:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:07 compute-2 sudo[113244]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:07 compute-2 ceph-mon[74913]: pgmap v207: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:53:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:53:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:53:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:53:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:53:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:53:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:53:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:53:07 compute-2 python3.9[113385]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:53:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:07 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144002470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:08 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 09:53:08 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 09:53:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:08 compute-2 sudo[113553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwzjqqvuyxmbnmkkyoehkwgivbozmjip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089988.3088598-64-115348874535317/AnsiballZ_setup.py'
Oct 10 09:53:08 compute-2 sudo[113553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:08.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:08 compute-2 python3.9[113555]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:53:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:09.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:09 compute-2 sudo[113553]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:09 compute-2 sudo[113638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxmfdvgcxizrnyrocvaurlqtjvxfmvhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760089988.3088598-64-115348874535317/AnsiballZ_dnf.py'
Oct 10 09:53:09 compute-2 sudo[113638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:09 compute-2 ceph-mon[74913]: pgmap v208: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:53:09 compute-2 python3.9[113640]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 10 09:53:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:09 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1440018c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:10.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:10 compute-2 sudo[113638]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:53:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:11.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:53:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:11 compute-2 sudo[113720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:53:11 compute-2 sudo[113720]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:53:11 compute-2 sudo[113720]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:11 compute-2 ceph-mon[74913]: pgmap v209: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:53:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:11 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:11 compute-2 python3.9[113818]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:53:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:12.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:13.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:13 compute-2 sudo[113955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:53:13 compute-2 sudo[113955]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:53:13 compute-2 sudo[113955]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:13 compute-2 python3.9[113984]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 09:53:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:13 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1440018c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:13 compute-2 ceph-mon[74913]: pgmap v210: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:53:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:53:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:53:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:14 compute-2 python3.9[114146]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:53:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:53:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:14.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:53:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.968147) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089994968207, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 429, "num_deletes": 251, "total_data_size": 564009, "memory_usage": 572776, "flush_reason": "Manual Compaction"}
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089994972445, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 372836, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12914, "largest_seqno": 13338, "table_properties": {"data_size": 370373, "index_size": 563, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5935, "raw_average_key_size": 18, "raw_value_size": 365412, "raw_average_value_size": 1131, "num_data_blocks": 24, "num_entries": 323, "num_filter_entries": 323, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089979, "oldest_key_time": 1760089979, "file_creation_time": 1760089994, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 4325 microseconds, and 2104 cpu microseconds.
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.972481) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 372836 bytes OK
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.972499) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.973744) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.973757) EVENT_LOG_v1 {"time_micros": 1760089994973753, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.973775) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 561283, prev total WAL file size 561283, number of live WAL files 2.
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.974352) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(364KB)], [24(13MB)]
Oct 10 09:53:14 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089994974416, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 14659531, "oldest_snapshot_seqno": -1}
Oct 10 09:53:14 compute-2 ceph-mon[74913]: pgmap v211: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:53:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:53:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:15.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4241 keys, 12701429 bytes, temperature: kUnknown
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089995042009, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 12701429, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12670525, "index_size": 19210, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 108700, "raw_average_key_size": 25, "raw_value_size": 12590509, "raw_average_value_size": 2968, "num_data_blocks": 813, "num_entries": 4241, "num_filter_entries": 4241, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760089994, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.042296) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 12701429 bytes
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.043523) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 216.6 rd, 187.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 13.6 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(73.4) write-amplify(34.1) OK, records in: 4756, records dropped: 515 output_compression: NoCompression
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.043549) EVENT_LOG_v1 {"time_micros": 1760089995043537, "job": 12, "event": "compaction_finished", "compaction_time_micros": 67668, "compaction_time_cpu_micros": 33043, "output_level": 6, "num_output_files": 1, "total_output_size": 12701429, "num_input_records": 4756, "num_output_records": 4241, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089995043752, "job": 12, "event": "table_file_deletion", "file_number": 26}
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089995045987, "job": 12, "event": "table_file_deletion", "file_number": 24}
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.974213) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.046018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.046022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.046024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.046026) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:53:15 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.046028) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 09:53:15 compute-2 python3.9[114298]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:53:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:15 compute-2 sshd-session[113165]: Connection closed by 192.168.122.30 port 45546
Oct 10 09:53:15 compute-2 sshd-session[113162]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:53:15 compute-2 systemd[1]: session-47.scope: Deactivated successfully.
Oct 10 09:53:15 compute-2 systemd[1]: session-47.scope: Consumed 6.018s CPU time.
Oct 10 09:53:15 compute-2 systemd-logind[796]: Session 47 logged out. Waiting for processes to exit.
Oct 10 09:53:15 compute-2 systemd-logind[796]: Removed session 47.
Oct 10 09:53:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:15 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:53:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:15 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1440018c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:53:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:16.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:17.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:17 compute-2 ceph-mon[74913]: pgmap v212: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:53:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:17 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:53:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:18.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:53:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:53:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:53:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:53:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:53:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:19.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:53:19 compute-2 ceph-mon[74913]: pgmap v213: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:53:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:19 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:20 compute-2 sshd-session[114327]: Accepted publickey for zuul from 192.168.122.30 port 35080 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:53:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:20 compute-2 systemd-logind[796]: New session 48 of user zuul.
Oct 10 09:53:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:20 compute-2 systemd[1]: Started Session 48 of User zuul.
Oct 10 09:53:20 compute-2 sshd-session[114327]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:53:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:53:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:20.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:53:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:53:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:21.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:53:21 compute-2 ceph-mon[74913]: pgmap v214: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:53:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:21 compute-2 python3.9[114482]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:53:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:21 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:53:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:21 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:53:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:22.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:53:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:53:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:23.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:53:23 compute-2 sudo[114638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldxuzwzasyqqfmtfypwitortiqidxaip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090002.7640948-112-66302511949250/AnsiballZ_file.py'
Oct 10 09:53:23 compute-2 sudo[114638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:23 compute-2 python3.9[114640]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:23 compute-2 sudo[114638]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:23 compute-2 ceph-mon[74913]: pgmap v215: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:53:23 compute-2 sudo[114790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqcsmwynkwxhrfyalpndmqmncnoejchr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090003.5930042-112-43581865229272/AnsiballZ_file.py'
Oct 10 09:53:23 compute-2 sudo[114790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:23 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:24 compute-2 python3.9[114792]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:24 compute-2 sudo[114790]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:24.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:24 compute-2 sudo[114944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujsykondjzhjmioyckyysadovoimewih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090004.306494-155-188169185077042/AnsiballZ_stat.py'
Oct 10 09:53:24 compute-2 sudo[114944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:24 compute-2 python3.9[114946]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:24 compute-2 sudo[114944]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:25.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:25 compute-2 ceph-mon[74913]: pgmap v216: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:53:25 compute-2 sudo[115067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzhuotndxgqadcuerewcfdupnbfyktra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090004.306494-155-188169185077042/AnsiballZ_copy.py'
Oct 10 09:53:25 compute-2 sudo[115067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:25 compute-2 python3.9[115069]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090004.306494-155-188169185077042/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=82e52d2e0222fcf71d7bc250104afff621190352 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:25 compute-2 sudo[115067]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:25 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:26 compute-2 sudo[115219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqfhydkgdnkbkcwurysbntlkumbhsqlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090005.8433385-155-16603967517101/AnsiballZ_stat.py'
Oct 10 09:53:26 compute-2 sudo[115219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:26 compute-2 python3.9[115221]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:26 compute-2 sudo[115219]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:26.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:26 compute-2 sudo[115343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wycgmczilzemiazyekuagaubwhojtude ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090005.8433385-155-16603967517101/AnsiballZ_copy.py'
Oct 10 09:53:26 compute-2 sudo[115343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:26 compute-2 python3.9[115345]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090005.8433385-155-16603967517101/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=6d432417c0c3c485924638569c72973f4b3272fb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:26 compute-2 sudo[115343]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:27.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095327 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 09:53:27 compute-2 sudo[115496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-motnnagqvzofikjtlhmrrbvxylrooqxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090006.977299-155-271065487793152/AnsiballZ_stat.py'
Oct 10 09:53:27 compute-2 sudo[115496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:27 compute-2 ceph-mon[74913]: pgmap v217: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:53:27 compute-2 python3.9[115498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:27 compute-2 sudo[115496]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:27 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:27 compute-2 sudo[115620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtcymuzlkqgeylmjfdirscryfugttqlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090006.977299-155-271065487793152/AnsiballZ_copy.py'
Oct 10 09:53:27 compute-2 sudo[115620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:28 compute-2 python3.9[115622]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090006.977299-155-271065487793152/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=22dd871d21e0e7808e7ed0de3e38963760611c24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:28 compute-2 sudo[115620]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:28.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:28 compute-2 sudo[115774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiwfvybqgimifenhnkunkhbqqbdalcti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090008.4234416-278-206766970092252/AnsiballZ_file.py'
Oct 10 09:53:28 compute-2 sudo[115774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:29 compute-2 python3.9[115776]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:29 compute-2 sudo[115774]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:29.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:29 compute-2 sudo[115926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhlooswmyxjxroolzldhiattvnvylvhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090009.1611176-278-68721225347113/AnsiballZ_file.py'
Oct 10 09:53:29 compute-2 sudo[115926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:29 compute-2 ceph-mon[74913]: pgmap v218: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:53:29 compute-2 python3.9[115928]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:29 compute-2 sudo[115926]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:29 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:30 compute-2 sudo[116078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfykkupwjoiatpgyuiwjztbttrcqxtdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090009.7904294-321-128687080074166/AnsiballZ_stat.py'
Oct 10 09:53:30 compute-2 sudo[116078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:30 compute-2 python3.9[116080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:30 compute-2 sudo[116078]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:30 compute-2 sudo[116202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsfrdnbfuwxwwcjuwsxkynicyanrrugz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090009.7904294-321-128687080074166/AnsiballZ_copy.py'
Oct 10 09:53:30 compute-2 sudo[116202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:30.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:30 compute-2 python3.9[116204]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090009.7904294-321-128687080074166/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=dd9d6a73b1e231095db4a2bfe6482df0f3a33661 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:30 compute-2 sudo[116202]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:31.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:31 compute-2 sudo[116355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bykvkggmdtgicqyhzzvsbwwrtdvveqgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090011.0296671-321-97806507666249/AnsiballZ_stat.py'
Oct 10 09:53:31 compute-2 sudo[116355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:31 compute-2 python3.9[116357]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:31 compute-2 sudo[116355]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:31 compute-2 ceph-mon[74913]: pgmap v219: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:53:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:53:31 compute-2 sudo[116358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:53:31 compute-2 sudo[116358]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:53:31 compute-2 sudo[116358]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:31 compute-2 sudo[116503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhrbimzixzwbjpqbrltatklooitclxpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090011.0296671-321-97806507666249/AnsiballZ_copy.py'
Oct 10 09:53:31 compute-2 sudo[116503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:31 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:32 compute-2 python3.9[116505]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090011.0296671-321-97806507666249/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=abcc61006dfeb8ab87ea24afb3b53290e7b990dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:32 compute-2 sudo[116503]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:32 compute-2 sudo[116656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfhpqcpnfisucnoelnyasbrqxbfboejz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090012.2780507-321-276740763128526/AnsiballZ_stat.py'
Oct 10 09:53:32 compute-2 sudo[116656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:32.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:32 compute-2 python3.9[116658]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:32 compute-2 sudo[116656]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:33 compute-2 sudo[116780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vizjsspezmuxzsgkhsqmffdyuwzuvvoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090012.2780507-321-276740763128526/AnsiballZ_copy.py'
Oct 10 09:53:33 compute-2 sudo[116780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:33.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:33 compute-2 python3.9[116782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090012.2780507-321-276740763128526/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=56bf5fb5f6d0ebd1ad6e0802c492f9ea9fbe1bf5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:33 compute-2 sudo[116780]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:33 compute-2 ceph-mon[74913]: pgmap v220: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 426 B/s wr, 2 op/s
Oct 10 09:53:33 compute-2 sudo[116932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acscgssojneehdiauvhithxpgctcsukk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090013.4630926-438-235733708674164/AnsiballZ_file.py'
Oct 10 09:53:33 compute-2 sudo[116932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:33 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:33 compute-2 python3.9[116934]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:33 compute-2 sudo[116932]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:34 compute-2 sudo[117085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgeieamazfpomozahoavcvrtxcklssln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090014.0537157-438-149836196482513/AnsiballZ_file.py'
Oct 10 09:53:34 compute-2 sudo[117085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:34 compute-2 python3.9[117087]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:34 compute-2 sudo[117085]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:34.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000034s ======
Oct 10 09:53:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:35.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Oct 10 09:53:35 compute-2 sudo[117239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpwmxshdvgdnbexlpoqgmxarenqqckek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090014.8390055-480-278969805050227/AnsiballZ_stat.py'
Oct 10 09:53:35 compute-2 sudo[117239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:35 compute-2 python3.9[117241]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:35 compute-2 sudo[117239]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:35 compute-2 ceph-mon[74913]: pgmap v221: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:53:35 compute-2 sudo[117362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsajzrsxllxowaxurlfohqngmyiroakx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090014.8390055-480-278969805050227/AnsiballZ_copy.py'
Oct 10 09:53:35 compute-2 sudo[117362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:35 compute-2 python3.9[117364]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090014.8390055-480-278969805050227/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=7b7b2ca77b92e88bec61aff3421984fcd2e9a026 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:35 compute-2 sudo[117362]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:35 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:36 compute-2 sudo[117514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvrbvminvhfapgmfmtbfgjdsycpxmgao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090015.9769478-480-279426433375276/AnsiballZ_stat.py'
Oct 10 09:53:36 compute-2 sudo[117514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:36 compute-2 python3.9[117516]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:36 compute-2 sudo[117514]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:36.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:36 compute-2 sudo[117639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojkkugzczrhofyvhcszlilprjeovyalv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090015.9769478-480-279426433375276/AnsiballZ_copy.py'
Oct 10 09:53:36 compute-2 sudo[117639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:36 compute-2 python3.9[117642]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090015.9769478-480-279426433375276/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=abcc61006dfeb8ab87ea24afb3b53290e7b990dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:36 compute-2 sudo[117639]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:37.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:37 compute-2 sudo[117792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwqjdwbqmwmqtsnwnebzhtkrlacztnse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090017.1071286-480-49894045491253/AnsiballZ_stat.py'
Oct 10 09:53:37 compute-2 sudo[117792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:37 compute-2 python3.9[117794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:37 compute-2 sudo[117792]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:37 compute-2 ceph-mon[74913]: pgmap v222: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:53:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:37 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:37 compute-2 sudo[117915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkbzvpysxcglaukfecldfwoygfpjhfgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090017.1071286-480-49894045491253/AnsiballZ_copy.py'
Oct 10 09:53:37 compute-2 sudo[117915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:38 compute-2 python3.9[117917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090017.1071286-480-49894045491253/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=4b3eb023242fa3e834b8d259dc59353292772111 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:38 compute-2 sudo[117915]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:38.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:39 compute-2 sudo[118069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihgivjogfgluoxztzorzhjckozppnzqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090018.7801933-623-20121912048758/AnsiballZ_file.py'
Oct 10 09:53:39 compute-2 sudo[118069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:39.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:39 compute-2 python3.9[118071]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:39 compute-2 sudo[118069]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:39 compute-2 ceph-mon[74913]: pgmap v223: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:53:39 compute-2 sudo[118221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icdeclnfzmwnvkdoblfdqmwurcusprtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090019.4338968-646-160770295232989/AnsiballZ_stat.py'
Oct 10 09:53:39 compute-2 sudo[118221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:39 compute-2 python3.9[118223]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:39 compute-2 sudo[118221]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:39 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:40 compute-2 sudo[118344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fucahukvdugxmblzokqwqmhhehbvqeal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090019.4338968-646-160770295232989/AnsiballZ_copy.py'
Oct 10 09:53:40 compute-2 sudo[118344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:40 compute-2 python3.9[118346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090019.4338968-646-160770295232989/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:40 compute-2 sudo[118344]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:40.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:40 compute-2 sudo[118498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gczryzdvtqzhmqhlipitiermbhbkynfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090020.6140485-695-32907440814221/AnsiballZ_file.py'
Oct 10 09:53:40 compute-2 sudo[118498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:53:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:41.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:53:41 compute-2 python3.9[118500]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:41 compute-2 sudo[118498]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:41 compute-2 sudo[118650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cujnkgbvyvrkasgwmccazuljskteovdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090021.3195982-719-21212220630772/AnsiballZ_stat.py'
Oct 10 09:53:41 compute-2 sudo[118650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:41 compute-2 ceph-mon[74913]: pgmap v224: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:53:41 compute-2 python3.9[118652]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:41 compute-2 sudo[118650]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:41 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:42 compute-2 sudo[118773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvdpbyrzhliwwguayoueibiqlicdihgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090021.3195982-719-21212220630772/AnsiballZ_copy.py'
Oct 10 09:53:42 compute-2 sudo[118773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:42 compute-2 python3.9[118775]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090021.3195982-719-21212220630772/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:42 compute-2 sudo[118773]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004610 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:42.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:42 compute-2 sudo[118926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taubpxbtzlmkftdloxzpflxuxmyxzvhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090022.5191982-775-44046362375719/AnsiballZ_file.py'
Oct 10 09:53:42 compute-2 sudo[118926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:42 compute-2 python3.9[118928]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:42 compute-2 sudo[118926]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:43.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:43 compute-2 sudo[119079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muiqrepthtjrikjgatgjdlcvhhehwlur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090023.1106236-797-178952081481129/AnsiballZ_stat.py'
Oct 10 09:53:43 compute-2 sudo[119079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:43 compute-2 python3.9[119081]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:43 compute-2 sudo[119079]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:43 compute-2 ceph-mon[74913]: pgmap v225: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:53:43 compute-2 sudo[119202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unvhpfzumgypchksxxkysdeznvhzngoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090023.1106236-797-178952081481129/AnsiballZ_copy.py'
Oct 10 09:53:43 compute-2 sudo[119202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:43 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:44 compute-2 python3.9[119204]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090023.1106236-797-178952081481129/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:44 compute-2 sudo[119202]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:44.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:44 compute-2 sudo[119355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heapzwoimenyxsnbdlvsjxtltqiawrar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090024.3607368-837-158307161420706/AnsiballZ_file.py'
Oct 10 09:53:44 compute-2 sudo[119355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:44 compute-2 python3.9[119357]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:44 compute-2 sudo[119355]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:45.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:45 compute-2 sudo[119508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aimkyfijtrsdrwfdsoavetcbymftkruw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090025.1443326-862-52910869475216/AnsiballZ_stat.py'
Oct 10 09:53:45 compute-2 sudo[119508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:45 compute-2 python3.9[119510]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:45 compute-2 ceph-mon[74913]: pgmap v226: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:53:45 compute-2 sudo[119508]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:45 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:46 compute-2 sudo[119631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaguskuhepzodmgrhqhazlydjwmmbwla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090025.1443326-862-52910869475216/AnsiballZ_copy.py'
Oct 10 09:53:46 compute-2 sudo[119631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:46 compute-2 python3.9[119633]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090025.1443326-862-52910869475216/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:46 compute-2 sudo[119631]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:46.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:53:46 compute-2 sudo[119784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twiyocuhkvhcmivonodhkabzcvbrgozm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090026.4485624-903-172296806019570/AnsiballZ_file.py'
Oct 10 09:53:46 compute-2 sudo[119784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:46 compute-2 python3.9[119786]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:46 compute-2 sudo[119784]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000034s ======
Oct 10 09:53:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:47.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Oct 10 09:53:47 compute-2 sudo[119937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-selpdzsdnkfapozvoasqubdyxfxomutm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090027.0635226-927-179103326546534/AnsiballZ_stat.py'
Oct 10 09:53:47 compute-2 sudo[119937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:47 compute-2 python3.9[119939]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:47 compute-2 sudo[119937]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:47 compute-2 ceph-mon[74913]: pgmap v227: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:53:47 compute-2 sudo[120060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgqxkipbypqvyaxiuzjakqlaqxbkeazg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090027.0635226-927-179103326546534/AnsiballZ_copy.py'
Oct 10 09:53:47 compute-2 sudo[120060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:47 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:48 compute-2 python3.9[120062]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090027.0635226-927-179103326546534/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:48 compute-2 sudo[120060]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:48 compute-2 sudo[120213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iewebmhtkxzbrdddqwsrrnwdremnfztm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090028.2980685-972-87105427801823/AnsiballZ_file.py'
Oct 10 09:53:48 compute-2 sudo[120213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:48.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:48 compute-2 python3.9[120215]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:53:48 compute-2 sudo[120213]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:49.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:49 compute-2 sudo[120366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkwyhmknpfgtsiihgwqdymvyifolumop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090028.8873005-993-86912624521205/AnsiballZ_stat.py'
Oct 10 09:53:49 compute-2 sudo[120366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:49 compute-2 python3.9[120368]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:53:49 compute-2 sudo[120366]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:49 compute-2 ceph-mon[74913]: pgmap v228: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:53:49 compute-2 sudo[120489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awxxrmplyxppvaqvtcfwujypldtywwha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090028.8873005-993-86912624521205/AnsiballZ_copy.py'
Oct 10 09:53:49 compute-2 sudo[120489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:53:49 compute-2 python3.9[120491]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090028.8873005-993-86912624521205/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:53:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:49 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:49 compute-2 sudo[120489]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:50.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000034s ======
Oct 10 09:53:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:51.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Oct 10 09:53:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:51 compute-2 sudo[120518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:53:51 compute-2 sudo[120518]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:53:51 compute-2 sudo[120518]: pam_unix(sudo:session): session closed for user root
Oct 10 09:53:51 compute-2 ceph-mon[74913]: pgmap v229: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:53:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:51 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004670 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:52.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:53.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:53 compute-2 ceph-mon[74913]: pgmap v230: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:53:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:53 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:54.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:55.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:55 compute-2 ceph-mon[74913]: pgmap v231: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:53:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:55 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:56 compute-2 sshd-session[114330]: Connection closed by 192.168.122.30 port 35080
Oct 10 09:53:56 compute-2 sshd-session[114327]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:53:56 compute-2 systemd[1]: session-48.scope: Deactivated successfully.
Oct 10 09:53:56 compute-2 systemd[1]: session-48.scope: Consumed 22.425s CPU time.
Oct 10 09:53:56 compute-2 systemd-logind[796]: Session 48 logged out. Waiting for processes to exit.
Oct 10 09:53:56 compute-2 systemd-logind[796]: Removed session 48.
Oct 10 09:53:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:56.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:53:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:57.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:53:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:57 compute-2 ceph-mon[74913]: pgmap v232: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:53:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:57 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:53:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000034s ======
Oct 10 09:53:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:58.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Oct 10 09:53:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:53:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:53:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:59.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:53:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:53:59 compute-2 ceph-mon[74913]: pgmap v233: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:53:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:53:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:59 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:00.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:01.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:01 compute-2 ceph-mon[74913]: pgmap v234: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:54:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:54:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:01 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:02 compute-2 sshd-session[120553]: Accepted publickey for zuul from 192.168.122.30 port 57782 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:54:02 compute-2 systemd-logind[796]: New session 49 of user zuul.
Oct 10 09:54:02 compute-2 systemd[1]: Started Session 49 of User zuul.
Oct 10 09:54:02 compute-2 sshd-session[120553]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:54:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:02.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:02 compute-2 sudo[120708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihlhdihylmsyunjlhgkyguxzywicyqic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090042.1647766-28-273261950186309/AnsiballZ_file.py'
Oct 10 09:54:02 compute-2 sudo[120708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:03 compute-2 python3.9[120710]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:03 compute-2 sudo[120708]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:03.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:03 compute-2 sudo[120860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjtdzdlkyhgocqoeikpixzaymhgusaiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090043.2944539-64-172248528122302/AnsiballZ_stat.py'
Oct 10 09:54:03 compute-2 sudo[120860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:03 compute-2 ceph-mon[74913]: pgmap v235: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:54:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:03 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:03 compute-2 python3.9[120862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:03 compute-2 sudo[120860]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:04 compute-2 sudo[120984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxqumbkthtlarcgxjufxeobhnhowalhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090043.2944539-64-172248528122302/AnsiballZ_copy.py'
Oct 10 09:54:04 compute-2 sudo[120984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:04.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:04 compute-2 python3.9[120986]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090043.2944539-64-172248528122302/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=f4f20d3bcbb08befb7837fd0e595f186c33a7cc2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:04 compute-2 sudo[120984]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:05.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:05 compute-2 sudo[121137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adnvdjoamxtthvnhscpbyirzbdwcxilx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090044.921748-64-83293049704282/AnsiballZ_stat.py'
Oct 10 09:54:05 compute-2 sudo[121137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:05 compute-2 python3.9[121139]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:05 compute-2 sudo[121137]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:05 compute-2 sudo[121260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tebafovtrlwoxrcsqetgvgcamnggaleq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090044.921748-64-83293049704282/AnsiballZ_copy.py'
Oct 10 09:54:05 compute-2 sudo[121260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:05 compute-2 python3.9[121262]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090044.921748-64-83293049704282/.source.conf _original_basename=ceph.conf follow=False checksum=1a4b9adde8f120db415fb0ad56382b109e0fedc1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:05 compute-2 ceph-mon[74913]: pgmap v236: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:54:05 compute-2 sudo[121260]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:05 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:06.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:06 compute-2 sshd-session[120556]: Connection closed by 192.168.122.30 port 57782
Oct 10 09:54:06 compute-2 sshd-session[120553]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:54:06 compute-2 systemd[1]: session-49.scope: Deactivated successfully.
Oct 10 09:54:06 compute-2 systemd[1]: session-49.scope: Consumed 2.659s CPU time.
Oct 10 09:54:06 compute-2 systemd-logind[796]: Session 49 logged out. Waiting for processes to exit.
Oct 10 09:54:06 compute-2 systemd-logind[796]: Removed session 49.
Oct 10 09:54:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:07.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:07 compute-2 ceph-mon[74913]: pgmap v237: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:54:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:07 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:08.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000034s ======
Oct 10 09:54:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:09.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Oct 10 09:54:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:09 compute-2 ceph-mon[74913]: pgmap v238: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:54:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:09 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:10 compute-2 kernel: ganesha.nfsd[105961]: segfault at 50 ip 00007ff21737e32e sp 00007ff1cf7fd210 error 4 in libntirpc.so.5.8[7ff217363000+2c000] likely on CPU 2 (core 0, socket 2)
Oct 10 09:54:10 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 09:54:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046f0 fd 39 proxy ignored for local
Oct 10 09:54:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:10 compute-2 systemd[1]: Started Process Core Dump (PID 121293/UID 0).
Oct 10 09:54:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:10.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:11.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:11 compute-2 systemd-coredump[121294]: Process 98301 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 58:
                                                    #0  0x00007ff21737e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Oct 10 09:54:11 compute-2 systemd[1]: systemd-coredump@2-121293-0.service: Deactivated successfully.
Oct 10 09:54:11 compute-2 systemd[1]: systemd-coredump@2-121293-0.service: Consumed 1.140s CPU time.
Oct 10 09:54:11 compute-2 podman[121301]: 2025-10-10 09:54:11.73984774 +0000 UTC m=+0.030648809 container died ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Oct 10 09:54:11 compute-2 systemd[1]: var-lib-containers-storage-overlay-93915b46d87e5adfc5a8e959d16f7d82e85ff82cf718b869d3a86bc987db93cb-merged.mount: Deactivated successfully.
Oct 10 09:54:11 compute-2 sudo[121311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:54:11 compute-2 sudo[121311]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:54:11 compute-2 sudo[121311]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:11 compute-2 podman[121301]: 2025-10-10 09:54:11.789780693 +0000 UTC m=+0.080581742 container remove ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 09:54:11 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 09:54:11 compute-2 ceph-mon[74913]: pgmap v239: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:54:11 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 09:54:11 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.791s CPU time.
Oct 10 09:54:11 compute-2 sshd-session[121366]: Accepted publickey for zuul from 192.168.122.30 port 33150 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:54:11 compute-2 systemd-logind[796]: New session 50 of user zuul.
Oct 10 09:54:12 compute-2 systemd[1]: Started Session 50 of User zuul.
Oct 10 09:54:12 compute-2 sshd-session[121366]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:54:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:12.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:13 compute-2 python3.9[121523]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:54:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:13.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:13 compute-2 sudo[121528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:54:13 compute-2 sudo[121528]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:54:13 compute-2 sudo[121528]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:13 compute-2 sudo[121557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:54:13 compute-2 sudo[121557]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:54:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:13 compute-2 sudo[121557]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:13 compute-2 ceph-mon[74913]: pgmap v240: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:54:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:54:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:54:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:54:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:54:14 compute-2 sudo[121757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwzxltejdzevlnbihzzlxwbfhjzntyxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090053.8189046-64-248914331464782/AnsiballZ_file.py'
Oct 10 09:54:14 compute-2 sudo[121757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:14 compute-2 python3.9[121759]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:54:14 compute-2 sudo[121757]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:14.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:14 compute-2 sudo[121911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-norxdfevlqjxjgiuxentioqnvkyrxhbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090054.641242-64-133309592351654/AnsiballZ_file.py'
Oct 10 09:54:14 compute-2 sudo[121911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:14 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:54:14 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:54:14 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:54:14 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:54:14 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:54:14 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:54:14 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:54:15 compute-2 python3.9[121913]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:54:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:15.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:15 compute-2 sudo[121911]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:15 compute-2 ceph-mon[74913]: pgmap v241: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:54:16 compute-2 python3.9[122063]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:54:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095416 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 09:54:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:16.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:54:16 compute-2 ceph-mon[74913]: pgmap v242: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:54:17 compute-2 sudo[122215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sibftieydytzqnxnwvhzevotjncxgtlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090056.360444-133-144109398107494/AnsiballZ_seboolean.py'
Oct 10 09:54:17 compute-2 sudo[122215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:17.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:17 compute-2 python3.9[122217]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 10 09:54:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:18 compute-2 sudo[122215]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:18.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:19.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:19 compute-2 ceph-mon[74913]: pgmap v243: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:54:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:19 compute-2 sudo[122374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rehdhcklozuuesrbabltgjwnfnpbdadh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090059.3525534-163-5378392572625/AnsiballZ_setup.py'
Oct 10 09:54:19 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct 10 09:54:19 compute-2 sudo[122374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:19 compute-2 python3.9[122376]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:54:20 compute-2 sudo[122381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:54:20 compute-2 sudo[122381]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:54:20 compute-2 sudo[122381]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:20 compute-2 sudo[122374]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:20 compute-2 sudo[122484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eohhtctlpuigwbedbdhyqcdiurzizktv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090059.3525534-163-5378392572625/AnsiballZ_dnf.py'
Oct 10 09:54:20 compute-2 sudo[122484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:20.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:20 compute-2 python3.9[122486]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:54:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:54:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:54:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:21.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:21 compute-2 ceph-mon[74913]: pgmap v244: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:54:21 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 3.
Oct 10 09:54:21 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:54:21 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.791s CPU time.
Oct 10 09:54:21 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:54:22 compute-2 sudo[122484]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:22 compute-2 podman[122559]: 2025-10-10 09:54:22.230206472 +0000 UTC m=+0.043005205 container create 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Oct 10 09:54:22 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e64abc64c69623192b44a062c89724fdf3d77809147a47565255988d23e459a8/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 09:54:22 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e64abc64c69623192b44a062c89724fdf3d77809147a47565255988d23e459a8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:54:22 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e64abc64c69623192b44a062c89724fdf3d77809147a47565255988d23e459a8/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:54:22 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e64abc64c69623192b44a062c89724fdf3d77809147a47565255988d23e459a8/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:54:22 compute-2 podman[122559]: 2025-10-10 09:54:22.28729108 +0000 UTC m=+0.100089843 container init 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Oct 10 09:54:22 compute-2 podman[122559]: 2025-10-10 09:54:22.292532832 +0000 UTC m=+0.105331565 container start 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Oct 10 09:54:22 compute-2 bash[122559]: 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387
Oct 10 09:54:22 compute-2 podman[122559]: 2025-10-10 09:54:22.212836402 +0000 UTC m=+0.025635155 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:54:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 09:54:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 09:54:22 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:54:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 09:54:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 09:54:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 09:54:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 09:54:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 09:54:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:54:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:22.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:23 compute-2 sudo[122744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufdabghnkowvsihquwiaqxgtnddtrbon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090062.4498994-199-8730486919854/AnsiballZ_systemd.py'
Oct 10 09:54:23 compute-2 sudo[122744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:23.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:23 compute-2 python3.9[122746]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 09:54:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:23 compute-2 sudo[122744]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:23 compute-2 ceph-mon[74913]: pgmap v245: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:54:24 compute-2 sudo[122899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfzmybweqadtgaxdbaizfuvkhfnskgdf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760090063.723122-223-145271706091269/AnsiballZ_edpm_nftables_snippet.py'
Oct 10 09:54:24 compute-2 sudo[122899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:24 compute-2 python3[122901]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct 10 09:54:24 compute-2 sudo[122899]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:24.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:25 compute-2 sudo[123053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jazohpbmclnkvvgrldwgysohmibnfhyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090064.772407-251-242745175821345/AnsiballZ_file.py'
Oct 10 09:54:25 compute-2 sudo[123053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:25.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:25 compute-2 python3.9[123055]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:25 compute-2 sudo[123053]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:25 compute-2 ceph-mon[74913]: pgmap v246: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Oct 10 09:54:26 compute-2 sudo[123205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgnnhfbiobhzyufqjpdypcnxrmnqayhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090065.5791683-274-172270702885169/AnsiballZ_stat.py'
Oct 10 09:54:26 compute-2 sudo[123205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:26 compute-2 python3.9[123207]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:26 compute-2 sudo[123205]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:26 compute-2 sudo[123284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfmxpscpbncbcnlcmppskvoqhmofkwsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090065.5791683-274-172270702885169/AnsiballZ_file.py'
Oct 10 09:54:26 compute-2 sudo[123284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:26.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:26 compute-2 python3.9[123286]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:26 compute-2 sudo[123284]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:27.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:27 compute-2 sudo[123437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkmeesaxrrkzvrjxzaecammbhrvycsgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090067.2199855-310-76806213122720/AnsiballZ_stat.py'
Oct 10 09:54:27 compute-2 sudo[123437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:27 compute-2 python3.9[123439]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:27 compute-2 sudo[123437]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095427 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 09:54:27 compute-2 sudo[123515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdqxwpotxzulegmafypwwgnqqodyhbfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090067.2199855-310-76806213122720/AnsiballZ_file.py'
Oct 10 09:54:27 compute-2 sudo[123515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:27 compute-2 ceph-mon[74913]: pgmap v247: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Oct 10 09:54:28 compute-2 python3.9[123517]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.tczzimlo recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:28 compute-2 sudo[123515]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:54:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:54:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 09:54:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:28.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:28 compute-2 sudo[123669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgwagygntywidrqyscddkdghmbsbwcgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090068.5051236-347-144289191187187/AnsiballZ_stat.py'
Oct 10 09:54:28 compute-2 sudo[123669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:28 compute-2 python3.9[123671]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:29 compute-2 sudo[123669]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:29.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:29 compute-2 sudo[123747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiwvypbtwqxrrnvhbtfqschfiqhapemz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090068.5051236-347-144289191187187/AnsiballZ_file.py'
Oct 10 09:54:29 compute-2 sudo[123747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:29 compute-2 python3.9[123749]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:29 compute-2 sudo[123747]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:29 compute-2 ceph-mon[74913]: pgmap v248: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Oct 10 09:54:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:30 compute-2 sudo[123900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyolbmwgoeuizfkbsxxbzcyfysoeyxcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090069.8454049-386-121923889855807/AnsiballZ_command.py'
Oct 10 09:54:30 compute-2 sudo[123900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:30.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:30 compute-2 python3.9[123902]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:54:30 compute-2 sudo[123900]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:31 compute-2 ceph-mon[74913]: pgmap v249: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Oct 10 09:54:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:54:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:31.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:54:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:31 compute-2 sudo[124054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgfonhektpvgctavoiwmormloqwunifr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760090071.1199882-409-104277967111302/AnsiballZ_edpm_nftables_from_files.py'
Oct 10 09:54:31 compute-2 sudo[124054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:31 compute-2 python3[124056]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 10 09:54:31 compute-2 sudo[124054]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:31 compute-2 sudo[124057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:54:31 compute-2 sudo[124057]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:54:31 compute-2 sudo[124057]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:54:32 compute-2 sudo[124231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zozaxscqvdswrknuxqvbttvzxyvxxiyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090072.1083715-433-70134766167542/AnsiballZ_stat.py'
Oct 10 09:54:32 compute-2 sudo[124231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:32 compute-2 python3.9[124233]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:32 compute-2 sudo[124231]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:32.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:54:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:54:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:54:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:33 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 09:54:33 compute-2 ceph-mon[74913]: pgmap v250: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:54:33 compute-2 sudo[124358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtqashlafbzutyjcrtomimacsyjdezwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090072.1083715-433-70134766167542/AnsiballZ_copy.py'
Oct 10 09:54:33 compute-2 sudo[124358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:33.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:33 compute-2 python3.9[124360]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090072.1083715-433-70134766167542/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:33 compute-2 sudo[124358]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:33 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:54:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:33 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:54:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:33 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:54:33 compute-2 sudo[124510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgerynsdjmxwebxdjrkkfhpnwmfsznrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090073.6032171-478-245081298478511/AnsiballZ_stat.py'
Oct 10 09:54:33 compute-2 sudo[124510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:34 compute-2 python3.9[124512]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:34 compute-2 sudo[124510]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:34 compute-2 sudo[124636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmcenbdencyhrritaikdjrkzflbnqopi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090073.6032171-478-245081298478511/AnsiballZ_copy.py'
Oct 10 09:54:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:34 compute-2 sudo[124636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:34.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:34 compute-2 python3.9[124638]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090073.6032171-478-245081298478511/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:34 compute-2 sudo[124636]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:35.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:35 compute-2 ceph-mon[74913]: pgmap v251: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:54:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:35 compute-2 sudo[124789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjmxyemgdefjaualnjahlahuvylnquey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090075.2725697-524-125248784544233/AnsiballZ_stat.py'
Oct 10 09:54:35 compute-2 sudo[124789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:35 compute-2 python3.9[124791]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:35 compute-2 sudo[124789]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:36 compute-2 sudo[124914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksdidenrleixpvvdubuqsvsxpwvsmdbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090075.2725697-524-125248784544233/AnsiballZ_copy.py'
Oct 10 09:54:36 compute-2 sudo[124914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:36 compute-2 python3.9[124916]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090075.2725697-524-125248784544233/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:36 compute-2 sudo[124914]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:36.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:37 compute-2 sudo[125068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efthdvhrdfttesslicgzkimkygdmqwzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090076.7601483-568-98735656420959/AnsiballZ_stat.py'
Oct 10 09:54:37 compute-2 sudo[125068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:37.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:37 compute-2 python3.9[125070]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:37 compute-2 sudo[125068]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:37 compute-2 ceph-mon[74913]: pgmap v252: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:54:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:37 compute-2 sudo[125193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jskphqcfqogczwbydeenmdmiaxddohvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090076.7601483-568-98735656420959/AnsiballZ_copy.py'
Oct 10 09:54:37 compute-2 sudo[125193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:37 compute-2 python3.9[125195]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090076.7601483-568-98735656420959/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:37 compute-2 sudo[125193]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:54:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:38.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:54:38 compute-2 sudo[125347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spgsurwaaauazzumfyjeowybpqeilsab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090078.411788-613-161270723694536/AnsiballZ_stat.py'
Oct 10 09:54:38 compute-2 sudo[125347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:39 compute-2 python3.9[125349]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:39 compute-2 sudo[125347]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:39.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:39 compute-2 ceph-mon[74913]: pgmap v253: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 09:54:39 compute-2 sudo[125472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljsnwjqchbnafqnesxikixnxubnoyahp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090078.411788-613-161270723694536/AnsiballZ_copy.py'
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 09:54:39 compute-2 sudo[125472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:54:39 compute-2 python3.9[125486]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090078.411788-613-161270723694536/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:39 compute-2 sudo[125472]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0000df0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0000df0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998000b60 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:40 compute-2 sudo[125640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjfxgfqdcskyokfnyqazdvtestywhsia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090080.2214205-658-148574744972636/AnsiballZ_file.py'
Oct 10 09:54:40 compute-2 sudo[125640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:40.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:40 compute-2 python3.9[125642]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:40 compute-2 sudo[125640]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:41.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:41 compute-2 sudo[125793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkskpveqkrcopevaokdffzoqorjndbbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090080.9938483-683-105994136316351/AnsiballZ_command.py'
Oct 10 09:54:41 compute-2 sudo[125793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:41 compute-2 python3.9[125795]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:54:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:41 compute-2 sudo[125793]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:41 compute-2 ceph-mon[74913]: pgmap v254: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 597 B/s wr, 2 op/s
Oct 10 09:54:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:41 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b4001c00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998000b60 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:42 compute-2 sudo[125948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lncyqeywdclplsmydsmphszggvuqhrlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090081.850931-707-242311826245526/AnsiballZ_blockinfile.py'
Oct 10 09:54:42 compute-2 sudo[125948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095442 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 09:54:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:42 compute-2 python3.9[125950]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:54:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:54:42 compute-2 sudo[125948]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:42.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:43.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:43 compute-2 sudo[126102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neglwkilxavskgldizygzurszwddifjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090082.92085-733-151864049966921/AnsiballZ_command.py'
Oct 10 09:54:43 compute-2 sudo[126102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:43 compute-2 python3.9[126104]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:54:43 compute-2 sudo[126102]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:43 compute-2 ceph-mon[74913]: pgmap v255: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Oct 10 09:54:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:43 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c000ea0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:44 compute-2 sudo[126255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzsmpuzswobtrispkrzcbznbzjbxnrpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090083.8174236-758-106296532356649/AnsiballZ_stat.py'
Oct 10 09:54:44 compute-2 sudo[126255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:44 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b4001c00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:44 compute-2 python3.9[126257]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:54:44 compute-2 sudo[126255]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:44 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:44.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:44 compute-2 sudo[126411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvvsmwtfyzhhqfjbbgsbkuzogoonqiky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090084.623661-781-129160602477557/AnsiballZ_command.py'
Oct 10 09:54:44 compute-2 sudo[126411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:45 compute-2 python3.9[126413]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:54:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:45.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:45 compute-2 sudo[126411]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:45 compute-2 ceph-mon[74913]: pgmap v256: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Oct 10 09:54:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:45 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:54:45 compute-2 sudo[126566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkyhohfvutbxuzrsixrzjrpnuedjqzhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090085.3999288-806-211079326334561/AnsiballZ_file.py'
Oct 10 09:54:45 compute-2 sudo[126566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:45 compute-2 python3.9[126568]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:45 compute-2 sudo[126566]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:45 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998001b40 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:46 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c0019c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:46 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b4001c00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:54:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:46.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:47 compute-2 python3.9[126720]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:54:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:54:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:47.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:54:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:47 compute-2 ceph-mon[74913]: pgmap v257: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Oct 10 09:54:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095447 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 09:54:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:47 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:48 compute-2 sudo[126871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qleyrgzzlzbzbnzjcfnctfxwavftfmpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090087.930014-925-67534477863999/AnsiballZ_command.py'
Oct 10 09:54:48 compute-2 sudo[126871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:48 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998001b40 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:48 compute-2 python3.9[126873]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:c0:16:5a:16" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:54:48 compute-2 ovs-vsctl[126874]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:c0:16:5a:16 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct 10 09:54:48 compute-2 sudo[126871]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:48 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c0019c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:48.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:49 compute-2 sudo[127026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-godfuyigtxhahlqdjcniygsjheniqngs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090088.7828364-952-39739369359861/AnsiballZ_command.py'
Oct 10 09:54:49 compute-2 sudo[127026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:49.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:49 compute-2 python3.9[127028]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:54:49 compute-2 sudo[127026]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:49 compute-2 ceph-mon[74913]: pgmap v258: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Oct 10 09:54:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:49 compute-2 sudo[127181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keasskxglbfnegtdushunbxyazpvrdjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090089.6403568-977-100782463073463/AnsiballZ_command.py'
Oct 10 09:54:49 compute-2 sudo[127181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:49 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b4001c00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:50 compute-2 python3.9[127183]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:54:50 compute-2 ovs-vsctl[127184]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct 10 09:54:50 compute-2 sudo[127181]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:50 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:50 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:50.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:51 compute-2 python3.9[127336]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:54:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:51.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:51 compute-2 ceph-mon[74913]: pgmap v259: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Oct 10 09:54:51 compute-2 sudo[127488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzdwzgcxertoftmsbtcfqajkvqwnfvai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090091.379508-1028-49446709947956/AnsiballZ_file.py'
Oct 10 09:54:51 compute-2 sudo[127488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:51 compute-2 python3.9[127490]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:54:51 compute-2 sudo[127488]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:51 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c0019c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:52 compute-2 sudo[127491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:54:52 compute-2 sudo[127491]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:54:52 compute-2 sudo[127491]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:52 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b4001c00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:52 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0009990 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:52 compute-2 sudo[127665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqaktujiodsjirvyvhluaxnldmlxiwof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090092.2048047-1051-36288918033706/AnsiballZ_stat.py'
Oct 10 09:54:52 compute-2 sudo[127665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:52 compute-2 python3.9[127667]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:52.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:52 compute-2 sudo[127665]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:52 compute-2 sudo[127745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loffmeyayrxnsjyevbuctmclfghwsqwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090092.2048047-1051-36288918033706/AnsiballZ_file.py'
Oct 10 09:54:52 compute-2 sudo[127745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:53 compute-2 python3.9[127747]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:54:53 compute-2 sudo[127745]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:54:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:53.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:54:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:53 compute-2 sudo[127897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvdxbyxnbvudxvxxxxijbzpyhccfeeul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090093.2770722-1051-26208188626146/AnsiballZ_stat.py'
Oct 10 09:54:53 compute-2 sudo[127897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:53 compute-2 ceph-mon[74913]: pgmap v260: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Oct 10 09:54:53 compute-2 python3.9[127899]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:53 compute-2 sudo[127897]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:53 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:54 compute-2 sudo[127975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwdqvwgmhtspbfyfoowwyybicvwjqlfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090093.2770722-1051-26208188626146/AnsiballZ_file.py'
Oct 10 09:54:54 compute-2 sudo[127975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:54 compute-2 python3.9[127977]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:54:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:54 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c002e50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:54 compute-2 sudo[127975]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:54 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c002e50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:54.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:54 compute-2 sudo[128129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaqobaysrugfiowyuadgkitljtxngabg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090094.6906433-1120-274095271819868/AnsiballZ_file.py'
Oct 10 09:54:54 compute-2 sudo[128129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:55 compute-2 python3.9[128131]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:55 compute-2 sudo[128129]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:54:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:55.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:54:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:55 compute-2 ceph-mon[74913]: pgmap v261: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Oct 10 09:54:55 compute-2 sudo[128281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oorbklqofnzpzjqejmopwwffdtrjtxur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090095.4788885-1145-85551175243810/AnsiballZ_stat.py'
Oct 10 09:54:55 compute-2 sudo[128281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:55 compute-2 python3.9[128283]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:55 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0009990 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:56 compute-2 sudo[128281]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:56 compute-2 sudo[128359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhyoaclvgzrsikushmwurbdhdhwzdgpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090095.4788885-1145-85551175243810/AnsiballZ_file.py'
Oct 10 09:54:56 compute-2 sudo[128359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:56 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:56 compute-2 python3.9[128361]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:56 compute-2 sudo[128359]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:56 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:56.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:56 compute-2 sudo[128513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kynsehsetkduegdvmandyeyortkqfvdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090096.6324947-1181-204631691349148/AnsiballZ_stat.py'
Oct 10 09:54:56 compute-2 sudo[128513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:57 compute-2 python3.9[128515]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:57 compute-2 sudo[128513]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:57.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:57 compute-2 sudo[128591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkovfjyhcyxflibeugysmommjjmkperg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090096.6324947-1181-204631691349148/AnsiballZ_file.py'
Oct 10 09:54:57 compute-2 sudo[128591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:57 compute-2 python3.9[128593]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:54:57 compute-2 sudo[128591]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:57 compute-2 ceph-mon[74913]: pgmap v262: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Oct 10 09:54:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:57 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:58 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0009990 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:58 compute-2 sudo[128743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xypcxwfdsjbsedwfgoojmcuwpdhsjbnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090097.97982-1217-120214238898249/AnsiballZ_systemd.py'
Oct 10 09:54:58 compute-2 sudo[128743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:58 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:54:58 compute-2 python3.9[128745]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:54:58 compute-2 systemd[1]: Reloading.
Oct 10 09:54:58 compute-2 systemd-sysv-generator[128773]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:54:58 compute-2 systemd-rc-local-generator[128767]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:54:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:58.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:58 compute-2 sudo[128743]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:54:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:54:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:59.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:54:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:54:59 compute-2 sudo[128934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzlrjxbxtfivhvtaexievgyewprgchyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090099.2544699-1240-170937535980326/AnsiballZ_stat.py'
Oct 10 09:54:59 compute-2 sudo[128934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:54:59 compute-2 ceph-mon[74913]: pgmap v263: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 255 B/s wr, 1 op/s
Oct 10 09:54:59 compute-2 python3.9[128936]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:54:59 compute-2 sudo[128934]: pam_unix(sudo:session): session closed for user root
Oct 10 09:54:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:54:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:59 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:00 compute-2 sudo[129012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ormgsrkmmlsouykmpeqqrbpqffdvwbau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090099.2544699-1240-170937535980326/AnsiballZ_file.py'
Oct 10 09:55:00 compute-2 sudo[129012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:00 compute-2 python3.9[129014]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:55:00 compute-2 sudo[129012]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:00 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:00 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:00.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:00 compute-2 sudo[129166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbkeqkdcgfzvhwbutlujaospcbtnqjlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090100.6116843-1276-180429825178158/AnsiballZ_stat.py'
Oct 10 09:55:00 compute-2 sudo[129166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:01 compute-2 python3.9[129168]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:55:01 compute-2 sudo[129166]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:01.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:01 compute-2 sudo[129244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yymfpumhksegvlajgxgklsgrsdoheavj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090100.6116843-1276-180429825178158/AnsiballZ_file.py'
Oct 10 09:55:01 compute-2 sudo[129244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:01 compute-2 python3.9[129246]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:55:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:01 compute-2 sudo[129244]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:01 compute-2 ceph-mon[74913]: pgmap v264: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:55:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:01 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:02 compute-2 sudo[129396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efchshdzmgdibgmffymowcwisouqccie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090101.8481357-1312-130554615250204/AnsiballZ_systemd.py'
Oct 10 09:55:02 compute-2 sudo[129396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:02 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:02 compute-2 python3.9[129398]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:55:02 compute-2 systemd[1]: Reloading.
Oct 10 09:55:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:02 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:02 compute-2 systemd-rc-local-generator[129426]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:55:02 compute-2 systemd-sysv-generator[129431]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:55:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:02.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:02 compute-2 systemd[1]: Starting Create netns directory...
Oct 10 09:55:02 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 09:55:02 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 09:55:02 compute-2 systemd[1]: Finished Create netns directory.
Oct 10 09:55:02 compute-2 sudo[129396]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:03.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:03 compute-2 sudo[129593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmrpfcbwjpgyhnyxmarcpxkxvwmpyaoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090103.2623591-1342-19725054404338/AnsiballZ_file.py'
Oct 10 09:55:03 compute-2 sudo[129593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:03 compute-2 python3.9[129595]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:03 compute-2 ceph-mon[74913]: pgmap v265: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:03 compute-2 sudo[129593]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:03 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:04 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c002e50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:04 compute-2 sudo[129745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-honjyqvsdngsgsydrsthbvtwmxzcfqgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090104.0617633-1366-121764447244985/AnsiballZ_stat.py'
Oct 10 09:55:04 compute-2 sudo[129745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:04 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c002e50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:04 compute-2 python3.9[129747]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:55:04 compute-2 sudo[129745]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:04.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:55:05 compute-2 sudo[129870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfwczabrfgfbwqmepyclvuueuaavrbid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090104.0617633-1366-121764447244985/AnsiballZ_copy.py'
Oct 10 09:55:05 compute-2 sudo[129870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:05.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:05 compute-2 python3.9[129872]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090104.0617633-1366-121764447244985/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:05 compute-2 sudo[129870]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:05 compute-2 ceph-mon[74913]: pgmap v266: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:06 compute-2 sudo[130022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecaewpzqjvsendzdyqmrrxoxjrlzcqdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090105.8927896-1417-89175521421131/AnsiballZ_file.py'
Oct 10 09:55:06 compute-2 sudo[130022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:06 compute-2 python3.9[130024]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:06 compute-2 sudo[130022]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998003c10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:06.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:07 compute-2 sudo[130176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fubmugcummrmvauujecftvaxmxvevlvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090106.755449-1441-17617002855457/AnsiballZ_stat.py'
Oct 10 09:55:07 compute-2 sudo[130176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:07.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:07 compute-2 python3.9[130178]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:55:07 compute-2 sudo[130176]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:07 compute-2 sudo[130299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snqrlodifrtwgvdrawfjxwkbcggxynvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090106.755449-1441-17617002855457/AnsiballZ_copy.py'
Oct 10 09:55:07 compute-2 sudo[130299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:07 compute-2 python3.9[130301]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090106.755449-1441-17617002855457/.source.json _original_basename=.acs83w8s follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:55:07 compute-2 sudo[130299]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:07 compute-2 ceph-mon[74913]: pgmap v267: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998003c10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:08 compute-2 sudo[130452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faweoazprawetspmdvrzygjnhcyxeajb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090108.2682588-1487-8358744816628/AnsiballZ_file.py'
Oct 10 09:55:08 compute-2 sudo[130452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:55:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:08.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:55:08 compute-2 python3.9[130454]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:55:08 compute-2 sudo[130452]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:09.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:09 compute-2 sudo[130605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymnwojldfceyydmprfjpoufzfzwmclxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090109.2025235-1510-239797917475395/AnsiballZ_stat.py'
Oct 10 09:55:09 compute-2 sudo[130605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:09 compute-2 sudo[130605]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:09 compute-2 ceph-mon[74913]: pgmap v268: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:55:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:55:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998003c10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:10 compute-2 sudo[130728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsnmelrfsqbyzpsadutifuryzarhqjtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090109.2025235-1510-239797917475395/AnsiballZ_copy.py'
Oct 10 09:55:10 compute-2 sudo[130728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:10 compute-2 sudo[130728]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:10.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:11 compute-2 sudo[130882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktnkzfecbxazdaemxhhbcjjdynvmkdds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090110.7204068-1561-25622692350866/AnsiballZ_container_config_data.py'
Oct 10 09:55:11 compute-2 sudo[130882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:11.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:11 compute-2 python3.9[130884]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct 10 09:55:11 compute-2 sudo[130882]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:11 compute-2 ceph-mon[74913]: pgmap v269: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:12 compute-2 sudo[130977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:55:12 compute-2 sudo[130977]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:55:12 compute-2 sudo[130977]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:12 compute-2 sudo[131062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxgdltkfoilxtvavrshhqpodkmqteyui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090111.6784205-1588-176563613136199/AnsiballZ_container_config_hash.py'
Oct 10 09:55:12 compute-2 sudo[131062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:12 compute-2 python3.9[131064]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 09:55:12 compute-2 sudo[131062]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8000fa0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:12.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:13.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:13 compute-2 sudo[131216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cekvhzwrwozgvdnxyecoszixxdgyoiyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090112.7433836-1615-245677923999608/AnsiballZ_podman_container_info.py'
Oct 10 09:55:13 compute-2 sudo[131216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:13 compute-2 python3.9[131218]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 10 09:55:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:13 compute-2 sudo[131216]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:13 compute-2 ceph-mon[74913]: pgmap v270: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:14.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:55:15 compute-2 sudo[131395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khbxecuoixgjzdtyohfzumqxkamtuddw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760090114.513281-1654-33075091981813/AnsiballZ_edpm_container_manage.py'
Oct 10 09:55:15 compute-2 sudo[131395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:15.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:15 compute-2 python3[131397]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 09:55:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:15 compute-2 ceph-mon[74913]: pgmap v271: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8001aa0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:16.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:55:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:17.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:17 compute-2 ceph-mon[74913]: pgmap v272: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8001aa0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:18.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:18 compute-2 ceph-mon[74913]: pgmap v273: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:55:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:19.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:19 compute-2 podman[131410]: 2025-10-10 09:55:19.746151897 +0000 UTC m=+4.366179383 image pull 70c92fb64e1eda6ef063d34e60e9a541e44edbaa51e757e8304331202c76a3a7 quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857
Oct 10 09:55:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 09:55:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 2498 writes, 14K keys, 2498 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s
                                           Cumulative WAL: 2498 writes, 2498 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2498 writes, 14K keys, 2498 commit groups, 1.0 writes per commit group, ingest: 37.80 MB, 0.06 MB/s
                                           Interval WAL: 2498 writes, 2498 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    178.0      0.12              0.05         6    0.019       0      0       0.0       0.0
                                             L6      1/0   12.11 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   2.9    176.1    154.4      0.39              0.18         5    0.078     21K   2261       0.0       0.0
                                            Sum      1/0   12.11 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.9    135.8    159.8      0.51              0.24        11    0.046     21K   2261       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.9    136.3    160.4      0.50              0.24        10    0.050     21K   2261       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0    176.1    154.4      0.39              0.18         5    0.078     21K   2261       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    180.8      0.11              0.05         5    0.023       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.020, interval 0.020
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.08 GB write, 0.13 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.5 seconds
                                           Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x56161a963350#2 capacity: 304.00 MB usage: 2.60 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 5.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(170,2.40 MB,0.789241%) FilterBlock(11,69.05 KB,0.0221805%) IndexBlock(11,132.45 KB,0.0425489%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Oct 10 09:55:19 compute-2 podman[131529]: 2025-10-10 09:55:19.872145646 +0000 UTC m=+0.042888873 container create 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 09:55:19 compute-2 podman[131529]: 2025-10-10 09:55:19.848861791 +0000 UTC m=+0.019605038 image pull 70c92fb64e1eda6ef063d34e60e9a541e44edbaa51e757e8304331202c76a3a7 quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857
Oct 10 09:55:19 compute-2 python3[131397]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume 
/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857
Oct 10 09:55:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:55:19 compute-2 sudo[131395]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:20 compute-2 sudo[131591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:55:20 compute-2 sudo[131591]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:55:20 compute-2 sudo[131591]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:20 compute-2 sudo[131616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Oct 10 09:55:20 compute-2 sudo[131616]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:55:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a80027b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:20 compute-2 sudo[131616]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:20 compute-2 sudo[131665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:55:20 compute-2 sudo[131665]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:55:20 compute-2 sudo[131665]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:20 compute-2 sudo[131690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:55:20 compute-2 sudo[131690]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:55:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:20.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:21 compute-2 sudo[131690]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:21.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:21 compute-2 ceph-mon[74913]: pgmap v274: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:55:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:55:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:55:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:55:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:55:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:55:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:55:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:55:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:55:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:22.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:22 compute-2 sudo[131874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqhfsjkplsyjrlfzjgswagxlwpjpzjag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090122.5977387-1679-24635774830685/AnsiballZ_stat.py'
Oct 10 09:55:22 compute-2 sudo[131874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:23 compute-2 python3.9[131876]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:55:23 compute-2 sudo[131874]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:23.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:23 compute-2 ceph-mon[74913]: pgmap v275: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:23 compute-2 sudo[132028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcysawvsicntkasxuweedquqovccjdnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090123.4410365-1705-170384138812507/AnsiballZ_file.py'
Oct 10 09:55:23 compute-2 sudo[132028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:23 compute-2 python3.9[132030]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:55:23 compute-2 sudo[132028]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:24 compute-2 sudo[132104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncsptnlpcxowtshfgactumnsfzvnunpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090123.4410365-1705-170384138812507/AnsiballZ_stat.py'
Oct 10 09:55:24 compute-2 sudo[132104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:24 compute-2 python3.9[132106]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:55:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:24 compute-2 sudo[132104]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:24 compute-2 sudo[132256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-memkpuiusculhruwqjxfuetlgpgifemr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090124.3119717-1705-252320942057628/AnsiballZ_copy.py'
Oct 10 09:55:24 compute-2 sudo[132256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:24.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:55:24 compute-2 python3.9[132258]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760090124.3119717-1705-252320942057628/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:55:24 compute-2 sudo[132256]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:25 compute-2 sudo[132333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbgscrgjifpmumyytmzhrjtbvugelpmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090124.3119717-1705-252320942057628/AnsiballZ_systemd.py'
Oct 10 09:55:25 compute-2 sudo[132333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:25.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:25 compute-2 python3.9[132335]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 09:55:25 compute-2 systemd[1]: Reloading.
Oct 10 09:55:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:25 compute-2 systemd-sysv-generator[132366]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:55:25 compute-2 systemd-rc-local-generator[132363]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:55:25 compute-2 ceph-mon[74913]: pgmap v276: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:25 compute-2 sudo[132333]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:26 compute-2 sudo[132444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muxqamagamjtzrlwisomjqbpikuwxgtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090124.3119717-1705-252320942057628/AnsiballZ_systemd.py'
Oct 10 09:55:26 compute-2 sudo[132444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:26 compute-2 python3.9[132446]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:55:26 compute-2 systemd[1]: Reloading.
Oct 10 09:55:26 compute-2 systemd-sysv-generator[132477]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:55:26 compute-2 systemd-rc-local-generator[132472]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:55:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb990000d00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:26 compute-2 systemd[1]: Starting ovn_controller container...
Oct 10 09:55:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:26.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:26 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:55:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a1891d963d03ff5546418e96f4e624e3abb79d61408efcf4dc0a9f2f55e7ddc/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 10 09:55:26 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3.
Oct 10 09:55:26 compute-2 podman[132488]: 2025-10-10 09:55:26.809649777 +0000 UTC m=+0.133595064 container init 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 09:55:26 compute-2 ovn_controller[132503]: + sudo -E kolla_set_configs
Oct 10 09:55:26 compute-2 podman[132488]: 2025-10-10 09:55:26.835147472 +0000 UTC m=+0.159092719 container start 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 09:55:26 compute-2 sudo[132507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:55:26 compute-2 edpm-start-podman-container[132488]: ovn_controller
Oct 10 09:55:26 compute-2 systemd[1]: Created slice User Slice of UID 0.
Oct 10 09:55:26 compute-2 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 10 09:55:26 compute-2 sudo[132507]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:55:26 compute-2 sudo[132507]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:26 compute-2 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 10 09:55:26 compute-2 systemd[1]: Starting User Manager for UID 0...
Oct 10 09:55:26 compute-2 edpm-start-podman-container[132487]: Creating additional drop-in dependency for "ovn_controller" (470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3)
Oct 10 09:55:26 compute-2 systemd[132566]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 10 09:55:26 compute-2 podman[132534]: 2025-10-10 09:55:26.913855199 +0000 UTC m=+0.065538357 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 10 09:55:26 compute-2 systemd[1]: 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3-766bfbd5ad31caf8.service: Main process exited, code=exited, status=1/FAILURE
Oct 10 09:55:26 compute-2 systemd[1]: 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3-766bfbd5ad31caf8.service: Failed with result 'exit-code'.
Oct 10 09:55:26 compute-2 systemd[1]: Reloading.
Oct 10 09:55:27 compute-2 systemd-sysv-generator[132614]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:55:27 compute-2 systemd-rc-local-generator[132610]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:55:27 compute-2 systemd[132566]: Queued start job for default target Main User Target.
Oct 10 09:55:27 compute-2 systemd[132566]: Created slice User Application Slice.
Oct 10 09:55:27 compute-2 systemd[132566]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 10 09:55:27 compute-2 systemd[132566]: Started Daily Cleanup of User's Temporary Directories.
Oct 10 09:55:27 compute-2 systemd[132566]: Reached target Paths.
Oct 10 09:55:27 compute-2 systemd[132566]: Reached target Timers.
Oct 10 09:55:27 compute-2 systemd[132566]: Starting D-Bus User Message Bus Socket...
Oct 10 09:55:27 compute-2 systemd[132566]: Starting Create User's Volatile Files and Directories...
Oct 10 09:55:27 compute-2 systemd[132566]: Finished Create User's Volatile Files and Directories.
Oct 10 09:55:27 compute-2 systemd[132566]: Listening on D-Bus User Message Bus Socket.
Oct 10 09:55:27 compute-2 systemd[132566]: Reached target Sockets.
Oct 10 09:55:27 compute-2 systemd[132566]: Reached target Basic System.
Oct 10 09:55:27 compute-2 systemd[132566]: Reached target Main User Target.
Oct 10 09:55:27 compute-2 systemd[132566]: Startup finished in 130ms.
Oct 10 09:55:27 compute-2 systemd[1]: Started User Manager for UID 0.
Oct 10 09:55:27 compute-2 systemd[1]: Started ovn_controller container.
Oct 10 09:55:27 compute-2 systemd[1]: Started Session c1 of User root.
Oct 10 09:55:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:27.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:27 compute-2 sudo[132444]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:27 compute-2 ovn_controller[132503]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 09:55:27 compute-2 ovn_controller[132503]: INFO:__main__:Validating config file
Oct 10 09:55:27 compute-2 ovn_controller[132503]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 09:55:27 compute-2 ovn_controller[132503]: INFO:__main__:Writing out command to execute
Oct 10 09:55:27 compute-2 systemd[1]: session-c1.scope: Deactivated successfully.
Oct 10 09:55:27 compute-2 ovn_controller[132503]: ++ cat /run_command
Oct 10 09:55:27 compute-2 ovn_controller[132503]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 10 09:55:27 compute-2 ovn_controller[132503]: + ARGS=
Oct 10 09:55:27 compute-2 ovn_controller[132503]: + sudo kolla_copy_cacerts
Oct 10 09:55:27 compute-2 systemd[1]: Started Session c2 of User root.
Oct 10 09:55:27 compute-2 ovn_controller[132503]: + [[ ! -n '' ]]
Oct 10 09:55:27 compute-2 ovn_controller[132503]: + . kolla_extend_start
Oct 10 09:55:27 compute-2 ovn_controller[132503]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 10 09:55:27 compute-2 ovn_controller[132503]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct 10 09:55:27 compute-2 ovn_controller[132503]: + umask 0022
Oct 10 09:55:27 compute-2 ovn_controller[132503]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct 10 09:55:27 compute-2 systemd[1]: session-c2.scope: Deactivated successfully.
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct 10 09:55:27 compute-2 NetworkManager[44866]: <info>  [1760090127.3396] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Oct 10 09:55:27 compute-2 NetworkManager[44866]: <info>  [1760090127.3405] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 09:55:27 compute-2 NetworkManager[44866]: <info>  [1760090127.3418] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct 10 09:55:27 compute-2 NetworkManager[44866]: <info>  [1760090127.3423] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Oct 10 09:55:27 compute-2 NetworkManager[44866]: <info>  [1760090127.3427] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 10 09:55:27 compute-2 kernel: br-int: entered promiscuous mode
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00022|main|INFO|OVS feature set changed, force recompute.
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 10 09:55:27 compute-2 ovn_controller[132503]: 2025-10-10T09:55:27Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 10 09:55:27 compute-2 NetworkManager[44866]: <info>  [1760090127.3558] manager: (ovn-38ab03-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Oct 10 09:55:27 compute-2 kernel: genev_sys_6081: entered promiscuous mode
Oct 10 09:55:27 compute-2 NetworkManager[44866]: <info>  [1760090127.3712] device (genev_sys_6081): carrier: link connected
Oct 10 09:55:27 compute-2 NetworkManager[44866]: <info>  [1760090127.3714] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Oct 10 09:55:27 compute-2 systemd-udevd[132664]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 09:55:27 compute-2 systemd-udevd[132668]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 09:55:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:27 compute-2 ceph-mon[74913]: pgmap v277: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:27 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:55:27 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:55:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8002930 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:28 compute-2 NetworkManager[44866]: <info>  [1760090128.0331] manager: (ovn-a1a60c-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Oct 10 09:55:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:28 compute-2 NetworkManager[44866]: <info>  [1760090128.5642] manager: (ovn-ee0899-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Oct 10 09:55:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:28.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:28 compute-2 sudo[132798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niqpgbvdxsbidfxwmffsyqtsatcbfasf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090128.5358405-1790-159984076433791/AnsiballZ_command.py'
Oct 10 09:55:28 compute-2 sudo[132798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:29 compute-2 python3.9[132800]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:55:29 compute-2 ovs-vsctl[132801]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct 10 09:55:29 compute-2 sudo[132798]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:29.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:29 compute-2 ceph-mon[74913]: pgmap v278: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:55:29 compute-2 sudo[132951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqbhvhzfitoahwtwlvlrnfxjjgffnmpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090129.4259624-1814-103276468346421/AnsiballZ_command.py'
Oct 10 09:55:29 compute-2 sudo[132951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:29 compute-2 python3.9[132953]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:55:29 compute-2 ovs-vsctl[132955]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct 10 09:55:29 compute-2 sudo[132951]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:55:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb990001820 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8003250 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:30.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:30 compute-2 sudo[133108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abzjvwgbvumpfocwyjktuedkegjfyqym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090130.5288165-1855-22764645283768/AnsiballZ_command.py'
Oct 10 09:55:30 compute-2 sudo[133108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:31 compute-2 python3.9[133110]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:55:31 compute-2 ovs-vsctl[133111]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct 10 09:55:31 compute-2 sudo[133108]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:31.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:31 compute-2 sshd-session[121371]: Connection closed by 192.168.122.30 port 33150
Oct 10 09:55:31 compute-2 sshd-session[121366]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:55:31 compute-2 systemd[1]: session-50.scope: Deactivated successfully.
Oct 10 09:55:31 compute-2 systemd[1]: session-50.scope: Consumed 54.620s CPU time.
Oct 10 09:55:31 compute-2 systemd-logind[796]: Session 50 logged out. Waiting for processes to exit.
Oct 10 09:55:31 compute-2 systemd-logind[796]: Removed session 50.
Oct 10 09:55:31 compute-2 ceph-mon[74913]: pgmap v279: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:55:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:32 compute-2 sudo[133136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:55:32 compute-2 sudo[133136]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:55:32 compute-2 sudo[133136]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb990001820 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb990001820 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:32.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:33.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:33 compute-2 ceph-mon[74913]: pgmap v280: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8003250 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:34.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:55:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Oct 10 09:55:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:35.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:35 compute-2 ceph-mon[74913]: pgmap v281: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8003250 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:36 compute-2 sshd-session[133166]: Accepted publickey for zuul from 192.168.122.30 port 42764 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:55:36 compute-2 systemd-logind[796]: New session 52 of user zuul.
Oct 10 09:55:36 compute-2 systemd[1]: Started Session 52 of User zuul.
Oct 10 09:55:36 compute-2 sshd-session[133166]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:55:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:36.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:37.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:37 compute-2 systemd[1]: Stopping User Manager for UID 0...
Oct 10 09:55:37 compute-2 systemd[132566]: Activating special unit Exit the Session...
Oct 10 09:55:37 compute-2 systemd[132566]: Stopped target Main User Target.
Oct 10 09:55:37 compute-2 systemd[132566]: Stopped target Basic System.
Oct 10 09:55:37 compute-2 systemd[132566]: Stopped target Paths.
Oct 10 09:55:37 compute-2 systemd[132566]: Stopped target Sockets.
Oct 10 09:55:37 compute-2 systemd[132566]: Stopped target Timers.
Oct 10 09:55:37 compute-2 systemd[132566]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 10 09:55:37 compute-2 systemd[132566]: Closed D-Bus User Message Bus Socket.
Oct 10 09:55:37 compute-2 systemd[132566]: Stopped Create User's Volatile Files and Directories.
Oct 10 09:55:37 compute-2 systemd[132566]: Removed slice User Application Slice.
Oct 10 09:55:37 compute-2 systemd[132566]: Reached target Shutdown.
Oct 10 09:55:37 compute-2 systemd[132566]: Finished Exit the Session.
Oct 10 09:55:37 compute-2 systemd[132566]: Reached target Exit the Session.
Oct 10 09:55:37 compute-2 systemd[1]: user@0.service: Deactivated successfully.
Oct 10 09:55:37 compute-2 systemd[1]: Stopped User Manager for UID 0.
Oct 10 09:55:37 compute-2 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 10 09:55:37 compute-2 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 10 09:55:37 compute-2 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 10 09:55:37 compute-2 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 10 09:55:37 compute-2 systemd[1]: Removed slice User Slice of UID 0.
Oct 10 09:55:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:37 compute-2 python3.9[133322]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:55:37 compute-2 ceph-mon[74913]: pgmap v282: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:38.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:38 compute-2 sudo[133478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcsoplzjmvgrsztsgbekpmjayjlpacos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090138.463121-64-171359609924966/AnsiballZ_file.py'
Oct 10 09:55:38 compute-2 sudo[133478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:39 compute-2 python3.9[133480]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:39 compute-2 sudo[133478]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:39.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:39 compute-2 sudo[133630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsloyqefqaokliyhvlmqibrvtiejfphv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090139.3198068-64-23622749436219/AnsiballZ_file.py'
Oct 10 09:55:39 compute-2 sudo[133630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:39 compute-2 python3.9[133632]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:39 compute-2 ceph-mon[74913]: pgmap v283: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:55:39 compute-2 sudo[133630]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:55:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8003250 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:40 compute-2 sudo[133782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zczcqrlndeascqbzpswhwnapjjisrewc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090140.0183225-64-159516641032077/AnsiballZ_file.py'
Oct 10 09:55:40 compute-2 sudo[133782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:40 compute-2 python3.9[133784]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:40 compute-2 sudo[133782]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:40.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:40 compute-2 sudo[133936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygyvabewhtsyvjhidbkeaxlsacxvcsee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090140.5870588-64-87156953810233/AnsiballZ_file.py'
Oct 10 09:55:40 compute-2 sudo[133936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:41 compute-2 python3.9[133938]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:41 compute-2 sudo[133936]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:41.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:41 compute-2 sudo[134088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vacrughotajlkjuffetgphbewqxhlemx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090141.1864498-64-140000851390915/AnsiballZ_file.py'
Oct 10 09:55:41 compute-2 sudo[134088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:41 compute-2 python3.9[134090]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:41 compute-2 sudo[134088]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:41 compute-2 ceph-mon[74913]: pgmap v284: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8003250 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:42.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:43.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:43 compute-2 python3.9[134242]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:55:43 compute-2 ceph-mon[74913]: pgmap v285: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:44 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:44 compute-2 sudo[134393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aafhemkxnfwokhtlgbtkpxiojjeciycf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090143.7301733-196-168114515499388/AnsiballZ_seboolean.py'
Oct 10 09:55:44 compute-2 sudo[134393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:44 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:44 compute-2 python3.9[134395]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 10 09:55:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:44 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c000b60 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:55:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:44.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:55:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:55:45 compute-2 sudo[134393]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:45.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:46 compute-2 ceph-mon[74913]: pgmap v286: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:46 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:46 compute-2 python3.9[134548]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:55:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:46 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:46 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:55:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:46.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:55:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:55:47 compute-2 ceph-mon[74913]: pgmap v287: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:47 compute-2 python3.9[134671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090145.4117916-220-160631112383643/.source follow=False _original_basename=haproxy.j2 checksum=4bca74f6ee0b6450624d22997e2f90c414d58b44 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:47.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:47 compute-2 python3.9[134821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:55:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:48 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c0016a0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:48 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:48 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:48.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:48 compute-2 python3.9[134942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090147.3316293-265-124786216155015/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:49.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:49 compute-2 sudo[135094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtgdybkmkvgxqoecpiuoksdlzjzmyrqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090149.1125646-316-245083796070596/AnsiballZ_setup.py'
Oct 10 09:55:49 compute-2 sudo[135094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:49 compute-2 ceph-mon[74913]: pgmap v288: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:55:49 compute-2 python3.9[135096]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:55:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 09:55:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 5546 writes, 24K keys, 5546 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 5546 writes, 880 syncs, 6.30 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 5546 writes, 24K keys, 5546 commit groups, 1.0 writes per commit group, ingest: 18.97 MB, 0.03 MB/s
                                           Interval WAL: 5546 writes, 880 syncs, 6.30 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Oct 10 09:55:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:55:49 compute-2 sudo[135094]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:50 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:50 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c0016a0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:50 compute-2 sudo[135178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiastsbnaitwialtcomyntipksafegnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090149.1125646-316-245083796070596/AnsiballZ_dnf.py'
Oct 10 09:55:50 compute-2 sudo[135178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:50 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:50 compute-2 python3.9[135180]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:55:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:50.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:51.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:51 compute-2 ceph-mon[74913]: pgmap v289: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:51 compute-2 sudo[135178]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:52 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:52 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:52 compute-2 sudo[135234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:55:52 compute-2 sudo[135234]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:55:52 compute-2 sudo[135234]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:52 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c0016a0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:52.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:52 compute-2 sudo[135360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwojzhuztdnlothcqgtmgltscdjtxjvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090152.2200074-352-257516954693121/AnsiballZ_systemd.py'
Oct 10 09:55:52 compute-2 sudo[135360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:55:53 compute-2 python3.9[135362]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 09:55:53 compute-2 sudo[135360]: pam_unix(sudo:session): session closed for user root
Oct 10 09:55:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:53.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:53 compute-2 ceph-mon[74913]: pgmap v290: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:53 compute-2 python3.9[135515]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:55:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:54 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:54 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:54 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:54 compute-2 python3.9[135636]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090153.534199-376-74726853277112/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:54.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:55:55 compute-2 python3.9[135788]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:55:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:55.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:55 compute-2 ceph-mon[74913]: pgmap v291: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:55 compute-2 python3.9[135909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090154.6685617-376-124232097612305/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:56 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:56 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:56 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:56.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:57 compute-2 ovn_controller[132503]: 2025-10-10T09:55:57Z|00025|memory|INFO|16256 kB peak resident set size after 29.8 seconds
Oct 10 09:55:57 compute-2 ovn_controller[132503]: 2025-10-10T09:55:57Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Oct 10 09:55:57 compute-2 podman[136035]: 2025-10-10 09:55:57.174136243 +0000 UTC m=+0.087644527 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 09:55:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:55:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:57.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:55:57 compute-2 python3.9[136070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:55:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:57 compute-2 ceph-mon[74913]: pgmap v292: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:55:57 compute-2 python3.9[136208]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090156.8647535-509-52518455297501/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:58 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:58 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:58 compute-2 python3.9[136358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:55:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:58 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:55:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:55:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:58.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:55:58 compute-2 python3.9[136480]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090157.8995488-509-235830547650296/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:55:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:55:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:55:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:59.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:55:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:55:59 compute-2 ceph-mon[74913]: pgmap v293: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:55:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:00 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:00 compute-2 python3.9[136631]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:56:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:00 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:00 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:00.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:00 compute-2 sudo[136787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxraopzodgzfnllrbqkhsjqzknlzqjft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090160.5912051-622-260791314968349/AnsiballZ_file.py'
Oct 10 09:56:00 compute-2 sudo[136787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:01 compute-2 python3.9[136789]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:56:01 compute-2 sudo[136787]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:56:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:01.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:56:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:01 compute-2 ceph-mon[74913]: pgmap v294: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:56:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:56:01 compute-2 sudo[136939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faahpzovcjrjhqcrazcbimbkljwcahxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090161.415398-646-171335780962141/AnsiballZ_stat.py'
Oct 10 09:56:01 compute-2 sudo[136939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:01 compute-2 python3.9[136941]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:56:01 compute-2 sudo[136939]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095602 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 09:56:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:02 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:02 compute-2 sudo[137017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccyrwxsywdcohtmweuolgerljnigjlee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090161.415398-646-171335780962141/AnsiballZ_file.py'
Oct 10 09:56:02 compute-2 sudo[137017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:02 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:02 compute-2 python3.9[137019]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:56:02 compute-2 sudo[137017]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:02 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:56:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:02.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:56:02 compute-2 sudo[137171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olungighqthjuqtzwxemcsfyrpzmyekr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090162.5662367-646-187046029239867/AnsiballZ_stat.py'
Oct 10 09:56:02 compute-2 sudo[137171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:03 compute-2 python3.9[137173]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:56:03 compute-2 sudo[137171]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:03.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:03 compute-2 sudo[137249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjmmfzqwvojuxioqcavixucciyoieadf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090162.5662367-646-187046029239867/AnsiballZ_file.py'
Oct 10 09:56:03 compute-2 sudo[137249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:03 compute-2 python3.9[137251]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:56:03 compute-2 ceph-mon[74913]: pgmap v295: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:56:03 compute-2 sudo[137249]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:04 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:04 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:04 compute-2 sudo[137401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daovniscjhjndlibofwgkzddadczifcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090164.0736153-715-222683823001770/AnsiballZ_file.py'
Oct 10 09:56:04 compute-2 sudo[137401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:04 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:04 compute-2 python3.9[137403]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:56:04 compute-2 sudo[137401]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:04.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:05 compute-2 sudo[137555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obxtpvkbnmmhzypdpmvvhnhyleinsxza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090164.8775012-739-214844330010331/AnsiballZ_stat.py'
Oct 10 09:56:05 compute-2 sudo[137555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:05.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:05 compute-2 python3.9[137557]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:56:05 compute-2 sudo[137555]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:05 compute-2 sudo[137633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoisklqmnoehfdvocbmgkppkobyijbbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090164.8775012-739-214844330010331/AnsiballZ_file.py'
Oct 10 09:56:05 compute-2 sudo[137633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:05 compute-2 ceph-mon[74913]: pgmap v296: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:56:05 compute-2 python3.9[137635]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:56:05 compute-2 sudo[137633]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:06 compute-2 sudo[137785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzuebfdqmzanzifmfxxjcvdctwbnvxll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090166.1930625-775-64650513932461/AnsiballZ_stat.py'
Oct 10 09:56:06 compute-2 sudo[137785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:06 compute-2 python3.9[137787]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:56:06 compute-2 sudo[137785]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:06.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:06 compute-2 sudo[137865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgnpacjfeaucvueowmegjcjyyqmfgxig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090166.1930625-775-64650513932461/AnsiballZ_file.py'
Oct 10 09:56:06 compute-2 sudo[137865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:07 compute-2 python3.9[137867]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:56:07 compute-2 sudo[137865]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:56:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:07.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:56:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:07 compute-2 ceph-mon[74913]: pgmap v297: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:56:07 compute-2 sudo[138017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glliibspslscrkuszjplneulunwshqiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090167.5288637-811-83538291885750/AnsiballZ_systemd.py'
Oct 10 09:56:07 compute-2 sudo[138017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:08 compute-2 python3.9[138019]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:56:08 compute-2 systemd[1]: Reloading.
Oct 10 09:56:08 compute-2 systemd-sysv-generator[138052]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:56:08 compute-2 systemd-rc-local-generator[138049]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:56:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:08 compute-2 sudo[138017]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 09:56:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:08.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 09:56:09 compute-2 sudo[138209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abkkavquqnvxgflyhqkfvqfpztsecnlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090168.8909876-835-263701774693484/AnsiballZ_stat.py'
Oct 10 09:56:09 compute-2 sudo[138209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:09.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:09 compute-2 python3.9[138211]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:56:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:09 compute-2 sudo[138209]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:09 compute-2 sudo[138287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inmfzpumuqgojxyxojxthcjhokmvvqmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090168.8909876-835-263701774693484/AnsiballZ_file.py'
Oct 10 09:56:09 compute-2 sudo[138287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:09 compute-2 ceph-mon[74913]: pgmap v298: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:56:09 compute-2 python3.9[138289]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:56:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:09 compute-2 sudo[138287]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:10 compute-2 sudo[138439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hidexfwmifuyccimnxudriqcyimpxgwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090170.1978774-871-217125377222296/AnsiballZ_stat.py'
Oct 10 09:56:10 compute-2 sudo[138439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:56:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:10 compute-2 python3.9[138441]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:56:10 compute-2 sudo[138439]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:10.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:10 compute-2 sudo[138519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynblgvnvcxxeycdfoesagsupvockimta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090170.1978774-871-217125377222296/AnsiballZ_file.py'
Oct 10 09:56:10 compute-2 sudo[138519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:11 compute-2 python3.9[138521]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:56:11 compute-2 sudo[138519]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:11.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:11 compute-2 ceph-mon[74913]: pgmap v299: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:56:11 compute-2 sudo[138671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxudnrxfklkfmsmoixmctcbzzucevfvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090171.5407882-908-152341204201346/AnsiballZ_systemd.py'
Oct 10 09:56:11 compute-2 sudo[138671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:12 compute-2 python3.9[138673]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:56:12 compute-2 systemd[1]: Reloading.
Oct 10 09:56:12 compute-2 systemd-rc-local-generator[138698]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:56:12 compute-2 systemd-sysv-generator[138703]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:56:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:12 compute-2 sudo[138710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:56:12 compute-2 sudo[138710]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:56:12 compute-2 sudo[138710]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:12 compute-2 systemd[1]: Starting Create netns directory...
Oct 10 09:56:12 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 09:56:12 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 09:56:12 compute-2 systemd[1]: Finished Create netns directory.
Oct 10 09:56:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:12 compute-2 sudo[138671]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:12.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:13.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:13 compute-2 sudo[138892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwtrlpvawjcnxlxzqoxlqshmmopskiym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090173.0499873-937-134424335083741/AnsiballZ_file.py'
Oct 10 09:56:13 compute-2 sudo[138892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:13 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:56:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:13 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:56:13 compute-2 python3.9[138894]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:56:13 compute-2 sudo[138892]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:13 compute-2 ceph-mon[74913]: pgmap v300: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:56:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a450 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:14 compute-2 sudo[139044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uycsxzjhkvqxaociepflsxwzefzxyurm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090173.8720157-961-170085316790550/AnsiballZ_stat.py'
Oct 10 09:56:14 compute-2 sudo[139044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:14 compute-2 python3.9[139046]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:56:14 compute-2 sudo[139044]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:14 compute-2 sudo[139168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxeyqxjvwiqpbgyfqnrgnxsqxpeoavjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090173.8720157-961-170085316790550/AnsiballZ_copy.py'
Oct 10 09:56:14 compute-2 sudo[139168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:14.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:14 compute-2 python3.9[139170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090173.8720157-961-170085316790550/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:56:14 compute-2 sudo[139168]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:15.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:15 compute-2 ceph-mon[74913]: pgmap v301: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:56:15 compute-2 sudo[139321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhsvcugpwjbbrxtmbleexpunonhhaibg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090175.57766-1013-246429712439510/AnsiballZ_file.py'
Oct 10 09:56:15 compute-2 sudo[139321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:16 compute-2 python3.9[139323]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:56:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:16 compute-2 sudo[139321]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a470 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:56:16 compute-2 sudo[139474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csjrqehnbcudbpdjetsrbwseyebzaqvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090176.413521-1036-128440861259022/AnsiballZ_stat.py'
Oct 10 09:56:16 compute-2 sudo[139474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:56:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:16.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:56:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:56:16 compute-2 python3.9[139476]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:56:16 compute-2 sudo[139474]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:17 compute-2 sudo[139598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aehbcuqthzeucdzykpssswdizvcwgqdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090176.413521-1036-128440861259022/AnsiballZ_copy.py'
Oct 10 09:56:17 compute-2 sudo[139598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:17.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:17 compute-2 python3.9[139600]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090176.413521-1036-128440861259022/.source.json _original_basename=._gvi0osk follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:56:17 compute-2 sudo[139598]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:17 compute-2 ceph-mon[74913]: pgmap v302: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 09:56:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:18 compute-2 sudo[139750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dntiljjfbopgopmmwlhxierfurryilar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090177.9126983-1081-264550121545308/AnsiballZ_file.py'
Oct 10 09:56:18 compute-2 sudo[139750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:18 compute-2 python3.9[139752]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:56:18 compute-2 sudo[139750]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a490 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:18.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:19 compute-2 sudo[139904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trtaoufvbxyhozjuelzgazrqsmttqmwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090178.8258505-1105-90285163407706/AnsiballZ_stat.py'
Oct 10 09:56:19 compute-2 sudo[139904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:19.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:19 compute-2 sudo[139904]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:19 compute-2 sudo[140027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmbovxdqieinlcwebtumipjydepvdofo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090178.8258505-1105-90285163407706/AnsiballZ_copy.py'
Oct 10 09:56:19 compute-2 sudo[140027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:19 compute-2 ceph-mon[74913]: pgmap v303: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:56:19 compute-2 sudo[140027]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:20.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:20 compute-2 sudo[140181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-httfohscwfxjbhsoikskhublteddakda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090180.3815374-1156-54821147780370/AnsiballZ_container_config_data.py'
Oct 10 09:56:20 compute-2 sudo[140181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:21 compute-2 python3.9[140183]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct 10 09:56:21 compute-2 sudo[140181]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:21.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:21 compute-2 sudo[140333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuuxpfmlafotzeldseoemacncriibeot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090181.3741436-1183-91314005881532/AnsiballZ_container_config_hash.py'
Oct 10 09:56:21 compute-2 sudo[140333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:21 compute-2 ceph-mon[74913]: pgmap v304: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:56:22 compute-2 python3.9[140335]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 09:56:22 compute-2 sudo[140333]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095622 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 09:56:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a4b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:56:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:22.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:56:22 compute-2 sudo[140487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdcbwafgfpshozqpkyaldkwhqfyafvbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090182.4668348-1210-112782830650842/AnsiballZ_podman_container_info.py'
Oct 10 09:56:22 compute-2 sudo[140487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:23 compute-2 python3.9[140489]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 10 09:56:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:23.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:23 compute-2 sudo[140487]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:23 compute-2 ceph-mon[74913]: pgmap v305: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:56:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a4d0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:24.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:24 compute-2 sudo[140667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llblpluqbfnenoxszjqisnqvvngmcqoj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760090184.2833412-1249-185329504289859/AnsiballZ_edpm_container_manage.py'
Oct 10 09:56:24 compute-2 sudo[140667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:25 compute-2 python3[140669]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 09:56:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:25.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:25 compute-2 ceph-mon[74913]: pgmap v306: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:56:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a4f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:26.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:27 compute-2 sudo[140734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:56:27 compute-2 sudo[140734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:56:27 compute-2 sudo[140734]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:27 compute-2 sudo[140759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:56:27 compute-2 sudo[140759]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:56:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:27.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:28 compute-2 ceph-mon[74913]: pgmap v307: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 09:56:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:28.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:29 compute-2 ceph-mon[74913]: pgmap v308: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 426 B/s wr, 2 op/s
Oct 10 09:56:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:29.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a510 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:30.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:31.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:31 compute-2 podman[140795]: 2025-10-10 09:56:31.630396947 +0000 UTC m=+3.897463011 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct 10 09:56:31 compute-2 ceph-mon[74913]: pgmap v309: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:56:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:56:31 compute-2 sudo[140759]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a530 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8001140 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:32 compute-2 sudo[140894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:56:32 compute-2 sudo[140894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:56:32 compute-2 sudo[140894]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:32.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:33.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:33 compute-2 podman[140684]: 2025-10-10 09:56:33.468741492 +0000 UTC m=+8.268605852 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 09:56:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:33 compute-2 podman[140944]: 2025-10-10 09:56:33.653292332 +0000 UTC m=+0.077290252 container create 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 09:56:33 compute-2 podman[140944]: 2025-10-10 09:56:33.602401922 +0000 UTC m=+0.026399882 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 09:56:33 compute-2 python3[140669]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 09:56:33 compute-2 sudo[140667]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:34 compute-2 ceph-mon[74913]: pgmap v310: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:56:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:56:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:56:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:56:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:56:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:56:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:56:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:56:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8001140 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c000f30 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:56:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:34.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:56:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:35 compute-2 sudo[141134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnrclliywabwawtzlbtwmrkrryymzryf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090194.765838-1273-142960072922936/AnsiballZ_stat.py'
Oct 10 09:56:35 compute-2 sudo[141134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:35 compute-2 ceph-mon[74913]: pgmap v311: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:56:35 compute-2 python3.9[141136]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:56:35 compute-2 sudo[141134]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:35.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:35 compute-2 sudo[141288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxqersmihnudfmshlbyfnksfqgrkkufl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090195.6689825-1300-234857333632839/AnsiballZ_file.py'
Oct 10 09:56:35 compute-2 sudo[141288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:36 compute-2 python3.9[141290]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:56:36 compute-2 sudo[141288]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8001140 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:36 compute-2 sudo[141364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykohggenwxpjydtayrjzzoyjkzkolcsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090195.6689825-1300-234857333632839/AnsiballZ_stat.py'
Oct 10 09:56:36 compute-2 sudo[141364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:36 compute-2 python3.9[141366]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 09:56:36 compute-2 sudo[141364]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:56:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:36.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:56:37 compute-2 sudo[141517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxhwgvgpfvjemaojxkfldbnyzpyyirrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090196.7515676-1300-247127145947468/AnsiballZ_copy.py'
Oct 10 09:56:37 compute-2 sudo[141517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:37.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:37 compute-2 python3.9[141519]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760090196.7515676-1300-247127145947468/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:56:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:37 compute-2 sudo[141517]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:37 compute-2 ceph-mon[74913]: pgmap v312: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:56:37 compute-2 sudo[141593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yopfjljrcaqaukfgclsmmfugifmusddb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090196.7515676-1300-247127145947468/AnsiballZ_systemd.py'
Oct 10 09:56:37 compute-2 sudo[141593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:38 compute-2 python3.9[141595]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 09:56:38 compute-2 systemd[1]: Reloading.
Oct 10 09:56:38 compute-2 systemd-rc-local-generator[141618]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:56:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c000f30 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:38 compute-2 systemd-sysv-generator[141624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:56:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:38 compute-2 sudo[141593]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a80023c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:38 compute-2 sudo[141684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:56:38 compute-2 sudo[141727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwmxxgmxwkpkbrzwhswaqukvqzhjlgsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090196.7515676-1300-247127145947468/AnsiballZ_systemd.py'
Oct 10 09:56:38 compute-2 sudo[141684]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:56:38 compute-2 sudo[141727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:38 compute-2 sudo[141684]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:38.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:38 compute-2 python3.9[141731]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:56:38 compute-2 systemd[1]: Reloading.
Oct 10 09:56:39 compute-2 systemd-sysv-generator[141763]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:56:39 compute-2 systemd-rc-local-generator[141754]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:56:39 compute-2 systemd[1]: Starting ovn_metadata_agent container...
Oct 10 09:56:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:56:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:39.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:56:39 compute-2 systemd[1]: Started libcrun container.
Oct 10 09:56:39 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2325f8d451b5d0b2b4e3183f8e0614a7d17eb52f78e5487ff9f04d9d9849509f/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 10 09:56:39 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2325f8d451b5d0b2b4e3183f8e0614a7d17eb52f78e5487ff9f04d9d9849509f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 10 09:56:39 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d.
Oct 10 09:56:39 compute-2 podman[141774]: 2025-10-10 09:56:39.443797781 +0000 UTC m=+0.152845676 container init 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: + sudo -E kolla_set_configs
Oct 10 09:56:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:56:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:56:39 compute-2 ceph-mon[74913]: pgmap v313: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 09:56:39 compute-2 podman[141774]: 2025-10-10 09:56:39.478197052 +0000 UTC m=+0.187244917 container start 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 10 09:56:39 compute-2 edpm-start-podman-container[141774]: ovn_metadata_agent
Oct 10 09:56:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 09:56:39 compute-2 podman[141797]: 2025-10-10 09:56:39.535779867 +0000 UTC m=+0.048701124 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Validating config file
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Copying service configuration files
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Writing out command to execute
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 10 09:56:39 compute-2 edpm-start-podman-container[141773]: Creating additional drop-in dependency for "ovn_metadata_agent" (2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d)
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: ++ cat /run_command
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: + CMD=neutron-ovn-metadata-agent
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: + ARGS=
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: + sudo kolla_copy_cacerts
Oct 10 09:56:39 compute-2 systemd[1]: Reloading.
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: + [[ ! -n '' ]]
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: + . kolla_extend_start
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: Running command: 'neutron-ovn-metadata-agent'
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: + umask 0022
Oct 10 09:56:39 compute-2 ovn_metadata_agent[141790]: + exec neutron-ovn-metadata-agent
Oct 10 09:56:39 compute-2 systemd-sysv-generator[141869]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:56:39 compute-2 systemd-rc-local-generator[141866]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:56:39 compute-2 systemd[1]: Started ovn_metadata_agent container.
Oct 10 09:56:39 compute-2 sudo[141727]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c001f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:40.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:41 compute-2 sshd-session[133169]: Connection closed by 192.168.122.30 port 42764
Oct 10 09:56:41 compute-2 sshd-session[133166]: pam_unix(sshd:session): session closed for user zuul
Oct 10 09:56:41 compute-2 systemd[1]: session-52.scope: Deactivated successfully.
Oct 10 09:56:41 compute-2 systemd[1]: session-52.scope: Consumed 53.759s CPU time.
Oct 10 09:56:41 compute-2 systemd-logind[796]: Session 52 logged out. Waiting for processes to exit.
Oct 10 09:56:41 compute-2 systemd-logind[796]: Removed session 52.
Oct 10 09:56:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:41.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.389 141795 INFO neutron.common.config [-] Logging enabled!
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.390 141795 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.390 141795 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.390 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.390 141795 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.398 141795 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.398 141795 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.398 141795 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.398 141795 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.398 141795 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.442 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.442 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.442 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.442 141795 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.443 141795 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.455 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 49146ebb-575d-4bd4-816c-0b242fb944ee (UUID: 49146ebb-575d-4bd4-816c-0b242fb944ee) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.478 141795 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.478 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.478 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.479 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.484 141795 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.492 141795 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.498 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '49146ebb-575d-4bd4-816c-0b242fb944ee'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], external_ids={}, name=49146ebb-575d-4bd4-816c-0b242fb944ee, nb_cfg_timestamp=1760090135358, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.500 141795 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fa4a23bdf70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Oct 10 09:56:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.501 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.501 141795 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.501 141795 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.501 141795 INFO oslo_service.service [-] Starting 1 workers
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.506 141795 DEBUG oslo_service.service [-] Started child 141903 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.509 141903 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-361953'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.510 141795 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpsqkyazod/privsep.sock']
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.528 141903 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.529 141903 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.529 141903 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.532 141903 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.538 141903 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 10 09:56:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.543 141903 INFO eventlet.wsgi.server [-] (141903) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Oct 10 09:56:41 compute-2 ceph-mon[74913]: pgmap v314: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:56:42 compute-2 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct 10 09:56:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a80023c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:42 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.175 141795 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 10 09:56:42 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.176 141795 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpsqkyazod/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 10 09:56:42 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.046 141908 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 10 09:56:42 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.052 141908 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 10 09:56:42 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.055 141908 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Oct 10 09:56:42 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.055 141908 INFO oslo.privsep.daemon [-] privsep daemon running as pid 141908
Oct 10 09:56:42 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.179 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[01e9a910-34c8-48ab-845c-f8e4b2b45d8a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 09:56:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:56:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c001f70 fd 41 proxy ignored for local
Oct 10 09:56:42 compute-2 kernel: ganesha.nfsd[140891]: segfault at 50 ip 00007fba6c89632e sp 00007fba297f9210 error 4 in libntirpc.so.5.8[7fba6c87b000+2c000] likely on CPU 2 (core 0, socket 2)
Oct 10 09:56:42 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 09:56:42 compute-2 systemd[1]: Started Process Core Dump (PID 141914/UID 0).
Oct 10 09:56:42 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.679 141908 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 09:56:42 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.679 141908 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 09:56:42 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.680 141908 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 09:56:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:42.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:43 compute-2 ceph-mon[74913]: pgmap v315: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.213 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[f5026b2d-e834-4424-a4d3-e3d6c094a10a]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.215 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, column=external_ids, values=({'neutron:ovn-metadata-id': 'cc7418c8-610c-5a79-bc13-35d330f4cf3b'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.223 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.232 141795 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.232 141795 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.232 141795 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.232 141795 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.232 141795 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 09:56:43 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 10 09:56:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:43.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:44.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:44 compute-2 systemd-coredump[141915]: Process 122579 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 63:
                                                    #0  0x00007fba6c89632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Oct 10 09:56:44 compute-2 systemd[1]: systemd-coredump@3-141914-0.service: Deactivated successfully.
Oct 10 09:56:44 compute-2 systemd[1]: systemd-coredump@3-141914-0.service: Consumed 1.186s CPU time.
Oct 10 09:56:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:44 compute-2 podman[141923]: 2025-10-10 09:56:44.961763172 +0000 UTC m=+0.025568251 container died 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 09:56:44 compute-2 systemd[1]: var-lib-containers-storage-overlay-e64abc64c69623192b44a062c89724fdf3d77809147a47565255988d23e459a8-merged.mount: Deactivated successfully.
Oct 10 09:56:45 compute-2 podman[141923]: 2025-10-10 09:56:45.004555239 +0000 UTC m=+0.068360308 container remove 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Oct 10 09:56:45 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 09:56:45 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 09:56:45 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.690s CPU time.
Oct 10 09:56:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:56:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:45.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:56:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:45 compute-2 ceph-mon[74913]: pgmap v316: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:56:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:46 compute-2 sshd-session[141967]: Accepted publickey for zuul from 192.168.122.30 port 39214 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 09:56:46 compute-2 systemd-logind[796]: New session 53 of user zuul.
Oct 10 09:56:46 compute-2 systemd[1]: Started Session 53 of User zuul.
Oct 10 09:56:46 compute-2 sshd-session[141967]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 09:56:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:56:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:46.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:47.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:47 compute-2 python3.9[142122]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 09:56:47 compute-2 ceph-mon[74913]: pgmap v317: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:56:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:48 compute-2 sudo[142277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjvmqhtmyslatlmpfhdrllhhinetclif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090208.305603-64-11067723559263/AnsiballZ_command.py'
Oct 10 09:56:48 compute-2 sudo[142277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:48.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:48 compute-2 python3.9[142279]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:56:49 compute-2 sudo[142277]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:49.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:49 compute-2 ceph-mon[74913]: pgmap v318: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 09:56:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:50 compute-2 sudo[142443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tueawnysoccukxndmhndppnmsnwmxbpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090209.5443249-97-256350593397626/AnsiballZ_systemd_service.py'
Oct 10 09:56:50 compute-2 sudo[142443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:50 compute-2 python3.9[142445]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 09:56:50 compute-2 systemd[1]: Reloading.
Oct 10 09:56:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:50 compute-2 systemd-rc-local-generator[142472]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:56:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095650 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 09:56:50 compute-2 systemd-sysv-generator[142475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:56:50 compute-2 sudo[142443]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:56:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:50.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:56:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:56:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:51.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:56:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:51 compute-2 python3.9[142632]: ansible-ansible.builtin.service_facts Invoked
Oct 10 09:56:51 compute-2 network[142649]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 09:56:51 compute-2 network[142650]: 'network-scripts' will be removed from distribution in near future.
Oct 10 09:56:51 compute-2 network[142651]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 09:56:51 compute-2 ceph-mon[74913]: pgmap v319: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:56:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:52 compute-2 sudo[142676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:56:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:56:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:52.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:56:52 compute-2 sudo[142676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:56:52 compute-2 sudo[142676]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:56:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:53.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:56:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:53 compute-2 ceph-mon[74913]: pgmap v320: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:56:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:54.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:55 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 4.
Oct 10 09:56:55 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:56:55 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.690s CPU time.
Oct 10 09:56:55 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:56:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:55.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:55 compute-2 podman[142832]: 2025-10-10 09:56:55.390370723 +0000 UTC m=+0.047161976 container create b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 09:56:55 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1baeee591cf25b2c556109a3adde6faf1bce3eb6b30f02f9b94da05ddab8c8f6/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 09:56:55 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1baeee591cf25b2c556109a3adde6faf1bce3eb6b30f02f9b94da05ddab8c8f6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:56:55 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1baeee591cf25b2c556109a3adde6faf1bce3eb6b30f02f9b94da05ddab8c8f6/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:56:55 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1baeee591cf25b2c556109a3adde6faf1bce3eb6b30f02f9b94da05ddab8c8f6/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:56:55 compute-2 podman[142832]: 2025-10-10 09:56:55.443795087 +0000 UTC m=+0.100586350 container init b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct 10 09:56:55 compute-2 podman[142832]: 2025-10-10 09:56:55.450138988 +0000 UTC m=+0.106930241 container start b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Oct 10 09:56:55 compute-2 bash[142832]: b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d
Oct 10 09:56:55 compute-2 podman[142832]: 2025-10-10 09:56:55.369095329 +0000 UTC m=+0.025886582 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:56:55 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:56:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 09:56:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 09:56:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 09:56:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 09:56:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 09:56:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 09:56:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 09:56:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:56:55 compute-2 ceph-mon[74913]: pgmap v321: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:56:56 compute-2 sudo[143046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmkxuyzwckethpbxxtartqaezklqyrms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090216.0394628-154-17293860009472/AnsiballZ_systemd_service.py'
Oct 10 09:56:56 compute-2 sudo[143046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:56 compute-2 python3.9[143048]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:56:56 compute-2 sudo[143046]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:56.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:57 compute-2 sudo[143201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhgkhlvflbrzyazljlsfvpcufzaalvkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090216.8079708-154-263407694950013/AnsiballZ_systemd_service.py'
Oct 10 09:56:57 compute-2 sudo[143201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:57.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:57 compute-2 python3.9[143203]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:56:57 compute-2 sudo[143201]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:57 compute-2 sudo[143354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flnqdhatntdmoycvpmzvazgfnsoowepa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090217.531687-154-84027401316371/AnsiballZ_systemd_service.py'
Oct 10 09:56:57 compute-2 sudo[143354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:57 compute-2 ceph-mon[74913]: pgmap v322: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:56:58 compute-2 python3.9[143356]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:56:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:58.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:59 compute-2 sudo[143354]: pam_unix(sudo:session): session closed for user root
Oct 10 09:56:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:56:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:56:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:59.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:56:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:56:59 compute-2 sudo[143509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qovqpacvsurztkfedculfgrxptkiuveb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090219.318202-154-115974979183009/AnsiballZ_systemd_service.py'
Oct 10 09:56:59 compute-2 sudo[143509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:56:59 compute-2 ceph-mon[74913]: pgmap v323: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Oct 10 09:56:59 compute-2 python3.9[143511]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:56:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:56:59 compute-2 sudo[143509]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:00 compute-2 sudo[143662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdozmttrrxrerywkhpwilcwiyikfpech ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090220.102515-154-280255872030489/AnsiballZ_systemd_service.py'
Oct 10 09:57:00 compute-2 sudo[143662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:00 compute-2 python3.9[143664]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:57:00 compute-2 sudo[143662]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:00.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:01 compute-2 sudo[143817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnujiqqfpsfluapdurvvrugxmcfxmcce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090220.886048-154-15562968084310/AnsiballZ_systemd_service.py'
Oct 10 09:57:01 compute-2 sudo[143817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:01.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:01 compute-2 python3.9[143819]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:57:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:01 compute-2 sudo[143817]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:01 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:57:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:01 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:57:01 compute-2 ceph-mon[74913]: pgmap v324: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Oct 10 09:57:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:57:01 compute-2 sudo[143970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsbrbmcudizoygtoqlmzbdkmxfuquvvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090221.6461544-154-198825976627711/AnsiballZ_systemd_service.py'
Oct 10 09:57:01 compute-2 sudo[143970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:02 compute-2 python3.9[143972]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 09:57:02 compute-2 sudo[143970]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:02 compute-2 podman[144021]: 2025-10-10 09:57:02.821014861 +0000 UTC m=+0.096334916 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 09:57:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:02.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:03 compute-2 sudo[144154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwnwxlnkwpojrvkwjxpeqpzuepbshvby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090222.7154477-310-76629685018257/AnsiballZ_file.py'
Oct 10 09:57:03 compute-2 sudo[144154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:03.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:03 compute-2 python3.9[144156]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:03 compute-2 sudo[144154]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:03 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Oct 10 09:57:03 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Oct 10 09:57:03 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Oct 10 09:57:03 compute-2 sudo[144306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iufbaudrulxtzmbisnjtpuhwzhpjunxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090223.5307999-310-32841904692785/AnsiballZ_file.py'
Oct 10 09:57:03 compute-2 sudo[144306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:03 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Oct 10 09:57:03 compute-2 ceph-mon[74913]: pgmap v325: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 09:57:03 compute-2 python3.9[144308]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:03 compute-2 sudo[144306]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:04 compute-2 sudo[144458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhdmqqujvpouukrbreihsmozgnivosbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090224.108185-310-177059166409462/AnsiballZ_file.py'
Oct 10 09:57:04 compute-2 sudo[144458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:04 compute-2 python3.9[144460]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:04 compute-2 sudo[144458]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:04.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:04 compute-2 sudo[144612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itmwubtpkprpuwslgmwpavizfzheilvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090224.695782-310-142254687879832/AnsiballZ_file.py'
Oct 10 09:57:04 compute-2 sudo[144612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:05 compute-2 ceph-mon[74913]: pgmap v326: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 09:57:05 compute-2 python3.9[144614]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:05 compute-2 sudo[144612]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:05.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:05 compute-2 sudo[144764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcyljlxigxmyrpziwkaltaaguiyycuuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090225.3177533-310-266931737461768/AnsiballZ_file.py'
Oct 10 09:57:05 compute-2 sudo[144764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:05 compute-2 python3.9[144766]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:05 compute-2 sudo[144764]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:06 compute-2 sudo[144916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyvngvssfinwftzeouvogqrusythwzfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090226.0163057-310-7645175307472/AnsiballZ_file.py'
Oct 10 09:57:06 compute-2 sudo[144916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:06 compute-2 python3.9[144918]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:06 compute-2 sudo[144916]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:06.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:06 compute-2 sudo[145070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcrvzqxyiqxhjcpgywecgaeingolxblq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090226.6601243-310-107176081704818/AnsiballZ_file.py'
Oct 10 09:57:06 compute-2 sudo[145070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:07 compute-2 python3.9[145072]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:07 compute-2 sudo[145070]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:07.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:07 compute-2 ceph-mon[74913]: pgmap v327: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 09:57:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:57:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:08 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce44000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:08 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce38001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:08 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce20000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:08 compute-2 sudo[145238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zccbtqkbnxwkpidpfexvclinvfrdaobc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090228.520367-461-237192405705153/AnsiballZ_file.py'
Oct 10 09:57:08 compute-2 sudo[145238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:08.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:08 compute-2 python3.9[145240]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:09 compute-2 sudo[145238]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:09.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:09 compute-2 sudo[145391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihiozwhrjmgmsopfpuozfjvbgfspzphc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090229.1287105-461-91746801892089/AnsiballZ_file.py'
Oct 10 09:57:09 compute-2 sudo[145391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:09 compute-2 ceph-mon[74913]: pgmap v328: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 88 KiB/s rd, 1023 B/s wr, 146 op/s
Oct 10 09:57:09 compute-2 python3.9[145393]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:09 compute-2 sudo[145391]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:09 compute-2 podman[145402]: 2025-10-10 09:57:09.797009615 +0000 UTC m=+0.063596827 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 10 09:57:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:10 compute-2 sudo[145563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duzcxmdqdemxktucufnpqcsfdjcludiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090229.8071043-461-216868480767186/AnsiballZ_file.py'
Oct 10 09:57:10 compute-2 sudo[145563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:10 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce1c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:10 compute-2 python3.9[145565]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:10 compute-2 sudo[145563]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:10 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce28000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095710 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 09:57:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:10 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce38001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:10 compute-2 sudo[145716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkcqtoztrafvavhzwdaakhhrigqdqsfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090230.4137857-461-195248877891610/AnsiballZ_file.py'
Oct 10 09:57:10 compute-2 sudo[145716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:10.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:10 compute-2 python3.9[145718]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:10 compute-2 sudo[145716]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:11 compute-2 sudo[145869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiiyeyzvcyfqufwwcqbqxqfsukaiascq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090231.0391052-461-170490142005646/AnsiballZ_file.py'
Oct 10 09:57:11 compute-2 sudo[145869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:11.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:11 compute-2 python3.9[145871]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:11 compute-2 sudo[145869]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:11 compute-2 ceph-mon[74913]: pgmap v329: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 87 KiB/s rd, 938 B/s wr, 145 op/s
Oct 10 09:57:11 compute-2 sudo[146021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpjdzhgvuswpendychpqiwmwdlbsjefb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090231.63241-461-155782240448796/AnsiballZ_file.py'
Oct 10 09:57:11 compute-2 sudo[146021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:12 compute-2 python3.9[146023]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:12 compute-2 sudo[146021]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:12 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce200016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:12 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce1c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:12 compute-2 sudo[146173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uguqabxpkjwwxefkancrhweloffdkcnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090232.2133787-461-277113907851400/AnsiballZ_file.py'
Oct 10 09:57:12 compute-2 sudo[146173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:12 compute-2 kernel: ganesha.nfsd[145102]: segfault at 50 ip 00007fcef39fe32e sp 00007fcec0ff8210 error 4 in libntirpc.so.5.8[7fcef39e3000+2c000] likely on CPU 1 (core 0, socket 1)
Oct 10 09:57:12 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 09:57:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:12 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce28001ac0 fd 39 proxy ignored for local
Oct 10 09:57:12 compute-2 systemd[1]: Started Process Core Dump (PID 146177/UID 0).
Oct 10 09:57:12 compute-2 python3.9[146175]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:57:12 compute-2 sudo[146173]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:12.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:12 compute-2 sudo[146204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:57:12 compute-2 sudo[146204]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:57:12 compute-2 sudo[146204]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:13.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:13 compute-2 ceph-mon[74913]: pgmap v330: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 87 KiB/s rd, 938 B/s wr, 145 op/s
Oct 10 09:57:13 compute-2 systemd-coredump[146178]: Process 142857 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 45:
                                                    #0  0x00007fcef39fe32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Oct 10 09:57:13 compute-2 systemd[1]: systemd-coredump@4-146177-0.service: Deactivated successfully.
Oct 10 09:57:13 compute-2 systemd[1]: systemd-coredump@4-146177-0.service: Consumed 1.162s CPU time.
Oct 10 09:57:13 compute-2 podman[146233]: 2025-10-10 09:57:13.868959555 +0000 UTC m=+0.023038791 container died b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Oct 10 09:57:13 compute-2 systemd[1]: var-lib-containers-storage-overlay-1baeee591cf25b2c556109a3adde6faf1bce3eb6b30f02f9b94da05ddab8c8f6-merged.mount: Deactivated successfully.
Oct 10 09:57:13 compute-2 podman[146233]: 2025-10-10 09:57:13.917133392 +0000 UTC m=+0.071212628 container remove b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0)
Oct 10 09:57:13 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 09:57:14 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 09:57:14 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.353s CPU time.
Oct 10 09:57:14 compute-2 sudo[146401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmetodqzajhzwbweyytlagcpsghhyvxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090233.8748908-613-197425096912601/AnsiballZ_command.py'
Oct 10 09:57:14 compute-2 sudo[146401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:14 compute-2 python3.9[146403]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:57:14 compute-2 sudo[146401]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:14.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:15 compute-2 python3.9[146557]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 09:57:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:15.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:15 compute-2 ceph-mon[74913]: pgmap v331: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 85 B/s wr, 143 op/s
Oct 10 09:57:16 compute-2 sudo[146707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uinjsjyexpycpattkiefztmbqnbeuthb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090235.7321754-668-141802836056539/AnsiballZ_systemd_service.py'
Oct 10 09:57:16 compute-2 sudo[146707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:16 compute-2 python3.9[146709]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 09:57:16 compute-2 systemd[1]: Reloading.
Oct 10 09:57:16 compute-2 systemd-rc-local-generator[146735]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:57:16 compute-2 systemd-sysv-generator[146740]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:57:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:57:16 compute-2 sudo[146707]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:16.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:17 compute-2 sudo[146896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmezdyebwejbnfzgukovxatpqdmwuymf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090237.0242805-691-93013969782736/AnsiballZ_command.py'
Oct 10 09:57:17 compute-2 sudo[146896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:17.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:17 compute-2 python3.9[146898]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:57:17 compute-2 sudo[146896]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:17 compute-2 ceph-mon[74913]: pgmap v332: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 85 B/s wr, 143 op/s
Oct 10 09:57:17 compute-2 sudo[147049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eniqfmxuspzbocxoewqxnztcchtnznvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090237.6109867-691-254362997072857/AnsiballZ_command.py'
Oct 10 09:57:17 compute-2 sudo[147049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:18 compute-2 python3.9[147051]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:57:18 compute-2 sudo[147049]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:18 compute-2 sudo[147203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkxvpwkdfsczydnpujyhrkyzkrofkwjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090238.2792509-691-209101341223210/AnsiballZ_command.py'
Oct 10 09:57:18 compute-2 sudo[147203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095718 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 09:57:18 compute-2 python3.9[147205]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:57:18 compute-2 sudo[147203]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:18.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:19 compute-2 sudo[147357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gixmisiorsvhxpqisogvdasbyflzqpmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090238.8708081-691-174832500781264/AnsiballZ_command.py'
Oct 10 09:57:19 compute-2 sudo[147357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:19.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:19 compute-2 python3.9[147359]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:57:19 compute-2 sudo[147357]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:19 compute-2 ceph-mon[74913]: pgmap v333: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 85 B/s wr, 143 op/s
Oct 10 09:57:19 compute-2 sudo[147510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzameiaslbtjuhlyjdgquxgeyagxbmcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090239.5583248-691-26423664343515/AnsiballZ_command.py'
Oct 10 09:57:19 compute-2 sudo[147510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:20 compute-2 python3.9[147512]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:57:20 compute-2 sudo[147510]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:20 compute-2 sudo[147663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuaaqwegybznwrynmztpnulrmtntfcrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090240.164405-691-280435840233598/AnsiballZ_command.py'
Oct 10 09:57:20 compute-2 sudo[147663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:20 compute-2 python3.9[147665]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:57:20 compute-2 sudo[147663]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:20.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:21 compute-2 sudo[147818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dliaklhjdtrfijmnagliecanmgdweyry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090240.8173933-691-174130259759040/AnsiballZ_command.py'
Oct 10 09:57:21 compute-2 sudo[147818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:21 compute-2 python3.9[147820]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 09:57:21 compute-2 sudo[147818]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:21.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:21 compute-2 ceph-mon[74913]: pgmap v334: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:57:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:22.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:23 compute-2 sudo[147973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qletdjouovdngzbmdprsdigjfjpmkeig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090243.009352-854-81778590009700/AnsiballZ_getent.py'
Oct 10 09:57:23 compute-2 sudo[147973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:23.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:23 compute-2 python3.9[147975]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct 10 09:57:23 compute-2 sudo[147973]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:23 compute-2 ceph-mon[74913]: pgmap v335: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 09:57:24 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 5.
Oct 10 09:57:24 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:57:24 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.353s CPU time.
Oct 10 09:57:24 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 09:57:24 compute-2 sudo[148168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbazbyavhixrcmbzetoxmffwmhwxgbfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090243.9084215-877-150210289627106/AnsiballZ_group.py'
Oct 10 09:57:24 compute-2 sudo[148168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:24 compute-2 podman[148170]: 2025-10-10 09:57:24.449139259 +0000 UTC m=+0.043483589 container create d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 09:57:24 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a086b1f9a52c382b0bf0c9603711827ef5e521aa04ce6dd516e78cd0a1e7bd/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 09:57:24 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a086b1f9a52c382b0bf0c9603711827ef5e521aa04ce6dd516e78cd0a1e7bd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 09:57:24 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a086b1f9a52c382b0bf0c9603711827ef5e521aa04ce6dd516e78cd0a1e7bd/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:57:24 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a086b1f9a52c382b0bf0c9603711827ef5e521aa04ce6dd516e78cd0a1e7bd/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 09:57:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:24 compute-2 podman[148170]: 2025-10-10 09:57:24.519465129 +0000 UTC m=+0.113809479 container init d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 09:57:24 compute-2 podman[148170]: 2025-10-10 09:57:24.428417653 +0000 UTC m=+0.022762003 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 09:57:24 compute-2 podman[148170]: 2025-10-10 09:57:24.524486698 +0000 UTC m=+0.118831028 container start d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default)
Oct 10 09:57:24 compute-2 bash[148170]: d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792
Oct 10 09:57:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 09:57:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 09:57:24 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 09:57:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 09:57:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 09:57:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 09:57:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 09:57:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 09:57:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 09:57:24 compute-2 python3.9[148171]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 09:57:24 compute-2 groupadd[148229]: group added to /etc/group: name=libvirt, GID=42473
Oct 10 09:57:24 compute-2 groupadd[148229]: group added to /etc/gshadow: name=libvirt
Oct 10 09:57:24 compute-2 groupadd[148229]: new group: name=libvirt, GID=42473
Oct 10 09:57:24 compute-2 sudo[148168]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:24.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:25.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:25 compute-2 sudo[148385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgdjxgvywbwapqiogbxlmfinnwyrmenc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090244.9353008-901-138223569841282/AnsiballZ_user.py'
Oct 10 09:57:25 compute-2 sudo[148385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:25 compute-2 python3.9[148387]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 10 09:57:25 compute-2 ceph-mon[74913]: pgmap v336: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:57:25 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 09:57:25 compute-2 useradd[148389]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Oct 10 09:57:25 compute-2 sudo[148385]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:26 compute-2 sudo[148547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjbwguroyysuqbjmzmzjlfnrdirhipfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090246.341853-934-253899206512286/AnsiballZ_setup.py'
Oct 10 09:57:26 compute-2 sudo[148547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:26.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:26 compute-2 python3.9[148549]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 09:57:27 compute-2 sudo[148547]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:27.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:27 compute-2 sudo[148632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jevxijkhvxiuzjzvdqjaopuyevdmabaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090246.341853-934-253899206512286/AnsiballZ_dnf.py'
Oct 10 09:57:27 compute-2 sudo[148632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:57:27 compute-2 ceph-mon[74913]: pgmap v337: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 09:57:27 compute-2 python3.9[148634]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 09:57:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:28.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:29.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:29 compute-2 ceph-mon[74913]: pgmap v338: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 85 B/s wr, 0 op/s
Oct 10 09:57:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 09:57:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 09:57:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:30.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:31.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:31 compute-2 ceph-mon[74913]: pgmap v339: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Oct 10 09:57:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:57:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:32.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:33 compute-2 sudo[148652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:57:33 compute-2 sudo[148652]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:57:33 compute-2 sudo[148652]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:33 compute-2 podman[148676]: 2025-10-10 09:57:33.198129411 +0000 UTC m=+0.137700196 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 09:57:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:33.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:33 compute-2 ceph-mon[74913]: pgmap v340: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 09:57:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000064s ======
Oct 10 09:57:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:34.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Oct 10 09:57:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:35.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:35 compute-2 ceph-mon[74913]: pgmap v341: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 09:57:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 09:57:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:36.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:37.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:37 compute-2 ceph-mon[74913]: pgmap v342: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 09:57:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:38 compute-2 sudo[148830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:57:38 compute-2 sudo[148830]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:57:38 compute-2 sudo[148830]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:38 compute-2 sudo[148855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:57:38 compute-2 sudo[148855]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:57:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:38.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:39 compute-2 sudo[148855]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:39.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:39 compute-2 ceph-mon[74913]: pgmap v343: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 09:57:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:57:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:57:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:57:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:57:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:57:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:57:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:57:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095740 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 09:57:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:40 compute-2 podman[148982]: 2025-10-10 09:57:40.788044933 +0000 UTC m=+0.060985630 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 10 09:57:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:40.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:41.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:57:41.444 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 09:57:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:57:41.445 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 09:57:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:57:41.445 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 09:57:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:41 compute-2 ceph-mon[74913]: pgmap v344: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 09:57:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9080025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:42.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:43.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:43 compute-2 ceph-mon[74913]: pgmap v345: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 09:57:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9080025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:44 compute-2 sudo[149007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:57:44 compute-2 sudo[149007]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:57:44 compute-2 sudo[149007]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:44.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:45.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:45 compute-2 ceph-mon[74913]: pgmap v346: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Oct 10 09:57:45 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:57:45 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:57:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:57:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:46.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:47.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:47 compute-2 ceph-mon[74913]: pgmap v347: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Oct 10 09:57:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9080032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:48.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:49.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:49 compute-2 ceph-mon[74913]: pgmap v348: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Oct 10 09:57:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c009990 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9080032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:50.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:51.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:51 compute-2 ceph-mon[74913]: pgmap v349: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:57:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c009990 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:52.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:53 compute-2 sudo[149045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:57:53 compute-2 sudo[149045]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:57:53 compute-2 sudo[149045]: pam_unix(sudo:session): session closed for user root
Oct 10 09:57:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:53.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:53 compute-2 ceph-mon[74913]: pgmap v350: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:57:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c009990 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:54.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:55.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:55 compute-2 ceph-mon[74913]: pgmap v351: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:57:55 compute-2 kernel: SELinux:  Converting 2770 SID table entries...
Oct 10 09:57:55 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 09:57:55 compute-2 kernel: SELinux:  policy capability open_perms=1
Oct 10 09:57:55 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 09:57:55 compute-2 kernel: SELinux:  policy capability always_check_network=0
Oct 10 09:57:55 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 09:57:55 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 09:57:55 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 09:57:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:56.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:57.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:57 compute-2 ceph-mon[74913]: pgmap v352: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:57:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:57:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:57:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:58.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:57:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:57:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:57:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:59.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:57:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:57:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:57:59 compute-2 ceph-mon[74913]: pgmap v353: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:58:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:00.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:01.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:01 compute-2 ceph-mon[74913]: pgmap v354: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:58:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:02.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:03.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:03 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Oct 10 09:58:03 compute-2 podman[149088]: 2025-10-10 09:58:03.893691784 +0000 UTC m=+0.163753292 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 10 09:58:03 compute-2 ceph-mon[74913]: pgmap v355: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:58:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:04.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:58:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:58:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:05.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:06 compute-2 ceph-mon[74913]: pgmap v356: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:06 compute-2 kernel: SELinux:  Converting 2770 SID table entries...
Oct 10 09:58:06 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 09:58:06 compute-2 kernel: SELinux:  policy capability open_perms=1
Oct 10 09:58:06 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 09:58:06 compute-2 kernel: SELinux:  policy capability always_check_network=0
Oct 10 09:58:06 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 09:58:06 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 09:58:06 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 09:58:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:06.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:07 compute-2 ceph-mon[74913]: pgmap v357: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:07.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:08.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:09.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:09 compute-2 ceph-mon[74913]: pgmap v358: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:58:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:58:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:10.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:11.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:11 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct 10 09:58:11 compute-2 ceph-mon[74913]: pgmap v359: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:11 compute-2 podman[149131]: 2025-10-10 09:58:11.793999485 +0000 UTC m=+0.059415231 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 09:58:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:12.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:13 compute-2 sudo[149152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:58:13 compute-2 sudo[149152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:58:13 compute-2 sudo[149152]: pam_unix(sudo:session): session closed for user root
Oct 10 09:58:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:13.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:13 compute-2 ceph-mon[74913]: pgmap v360: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:14.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:58:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:15.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:15 compute-2 ceph-mon[74913]: pgmap v361: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:16.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:58:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:17.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:18 compute-2 ceph-mon[74913]: pgmap v362: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:18.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:19 compute-2 ceph-mon[74913]: pgmap v363: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:58:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:19.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:58:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:20.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:21.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:21 compute-2 ceph-mon[74913]: pgmap v364: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:22.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:23.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:23 compute-2 ceph-mon[74913]: pgmap v365: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:24.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:58:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:25.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:25 compute-2 ceph-mon[74913]: pgmap v366: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:26.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:27.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:27 compute-2 ceph-mon[74913]: pgmap v367: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:28.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:29 compute-2 ceph-mon[74913]: pgmap v368: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:58:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:29.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:29 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:58:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:30.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:31.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:31 compute-2 ceph-mon[74913]: pgmap v369: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:58:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:32.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:33 compute-2 sudo[159034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:58:33 compute-2 sudo[159034]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:58:33 compute-2 sudo[159034]: pam_unix(sudo:session): session closed for user root
Oct 10 09:58:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:33.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:33 compute-2 ceph-mon[74913]: pgmap v370: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:34 compute-2 podman[160060]: 2025-10-10 09:58:34.809646 +0000 UTC m=+0.084761653 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 10 09:58:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:34.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:34 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:58:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:35.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:36 compute-2 ceph-mon[74913]: pgmap v371: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:36.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:37 compute-2 ceph-mon[74913]: pgmap v372: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:37.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:38.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:39.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:39 compute-2 ceph-mon[74913]: pgmap v373: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:58:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:58:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:40.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:58:41.445 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 09:58:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:58:41.446 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 09:58:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:58:41.446 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 09:58:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:41.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:41 compute-2 ceph-mon[74913]: pgmap v374: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:42 compute-2 podman[165801]: 2025-10-10 09:58:42.782863567 +0000 UTC m=+0.056374844 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 09:58:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:42.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:43.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:43 compute-2 ceph-mon[74913]: pgmap v375: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:44 compute-2 sudo[166022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:58:44 compute-2 sudo[166022]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:58:44 compute-2 sudo[166022]: pam_unix(sudo:session): session closed for user root
Oct 10 09:58:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:44.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:58:44 compute-2 sudo[166047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Oct 10 09:58:44 compute-2 sudo[166047]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:58:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:45.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:45 compute-2 podman[166151]: 2025-10-10 09:58:45.626973247 +0000 UTC m=+0.134119068 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct 10 09:58:45 compute-2 podman[166151]: 2025-10-10 09:58:45.73021688 +0000 UTC m=+0.237362701 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2)
Oct 10 09:58:45 compute-2 ceph-mon[74913]: pgmap v376: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:45 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct 10 09:58:46 compute-2 podman[166275]: 2025-10-10 09:58:46.202618624 +0000 UTC m=+0.089980450 container exec 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 09:58:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:46 compute-2 podman[166304]: 2025-10-10 09:58:46.274015692 +0000 UTC m=+0.054300533 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 09:58:46 compute-2 podman[166275]: 2025-10-10 09:58:46.341597236 +0000 UTC m=+0.228959072 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 09:58:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:46 compute-2 podman[166373]: 2025-10-10 09:58:46.938204612 +0000 UTC m=+0.105552617 container exec d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True)
Oct 10 09:58:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:46.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:58:47 compute-2 podman[166394]: 2025-10-10 09:58:47.013045879 +0000 UTC m=+0.056126981 container exec_died d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 09:58:47 compute-2 podman[166373]: 2025-10-10 09:58:47.039842833 +0000 UTC m=+0.207190808 container exec_died d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Oct 10 09:58:47 compute-2 podman[166437]: 2025-10-10 09:58:47.467280024 +0000 UTC m=+0.221243046 container exec 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 09:58:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:47.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:47 compute-2 podman[166459]: 2025-10-10 09:58:47.628032441 +0000 UTC m=+0.053642612 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 09:58:47 compute-2 podman[166437]: 2025-10-10 09:58:47.68163559 +0000 UTC m=+0.435598592 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 09:58:47 compute-2 podman[166506]: 2025-10-10 09:58:47.92497495 +0000 UTC m=+0.050061407 container exec 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, release=1793, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, description=keepalived for Ceph, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=)
Oct 10 09:58:48 compute-2 podman[166527]: 2025-10-10 09:58:48.052041293 +0000 UTC m=+0.050494471 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=2.2.4, com.redhat.component=keepalived-container, io.openshift.expose-services=, architecture=x86_64, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Oct 10 09:58:48 compute-2 podman[166506]: 2025-10-10 09:58:48.063166348 +0000 UTC m=+0.188252785 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, version=2.2.4, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, distribution-scope=public, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.openshift.expose-services=)
Oct 10 09:58:48 compute-2 ceph-mon[74913]: pgmap v377: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:48 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:58:48 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:58:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:48 compute-2 sudo[166047]: pam_unix(sudo:session): session closed for user root
Oct 10 09:58:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:48 compute-2 sudo[166576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 09:58:48 compute-2 sudo[166576]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:58:48 compute-2 sudo[166576]: pam_unix(sudo:session): session closed for user root
Oct 10 09:58:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:48 compute-2 sudo[166602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 09:58:48 compute-2 sudo[166602]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:58:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:48.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:49 compute-2 sudo[166602]: pam_unix(sudo:session): session closed for user root
Oct 10 09:58:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:49.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct 10 09:58:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:58:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:58:49 compute-2 ceph-mon[74913]: pgmap v378: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:58:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:58:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:50.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:51.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:51 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 09:58:51 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:58:51 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 09:58:51 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:58:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:52 compute-2 ceph-mon[74913]: pgmap v379: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:58:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 09:58:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 09:58:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 09:58:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:52.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:53 compute-2 sudo[166664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:58:53 compute-2 sudo[166664]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:58:53 compute-2 sudo[166664]: pam_unix(sudo:session): session closed for user root
Oct 10 09:58:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:53.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:54 compute-2 ceph-mon[74913]: pgmap v380: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:54.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:58:55 compute-2 ceph-mon[74913]: pgmap v381: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:55.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:56.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:58:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:57.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:58:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:58 compute-2 kernel: SELinux:  Converting 2771 SID table entries...
Oct 10 09:58:58 compute-2 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 09:58:58 compute-2 kernel: SELinux:  policy capability open_perms=1
Oct 10 09:58:58 compute-2 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 09:58:58 compute-2 kernel: SELinux:  policy capability always_check_network=0
Oct 10 09:58:58 compute-2 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 09:58:58 compute-2 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 09:58:58 compute-2 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 09:58:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:58:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000064s ======
Oct 10 09:58:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:58.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Oct 10 09:58:59 compute-2 ceph-mon[74913]: pgmap v382: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:58:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:58:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:58:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:59.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:58:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:58:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:00 compute-2 ceph-mon[74913]: pgmap v383: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:59:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:00 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:00 compute-2 groupadd[166710]: group added to /etc/group: name=dnsmasq, GID=992
Oct 10 09:59:00 compute-2 groupadd[166710]: group added to /etc/gshadow: name=dnsmasq
Oct 10 09:59:00 compute-2 groupadd[166710]: new group: name=dnsmasq, GID=992
Oct 10 09:59:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:00.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:01 compute-2 useradd[166717]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Oct 10 09:59:01 compute-2 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 09:59:01 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct 10 09:59:01 compute-2 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 09:59:01 compute-2 ceph-mon[74913]: pgmap v384: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:01.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:01 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:02 compute-2 groupadd[166730]: group added to /etc/group: name=clevis, GID=991
Oct 10 09:59:02 compute-2 groupadd[166730]: group added to /etc/gshadow: name=clevis
Oct 10 09:59:02 compute-2 groupadd[166730]: new group: name=clevis, GID=991
Oct 10 09:59:02 compute-2 useradd[166737]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Oct 10 09:59:02 compute-2 usermod[166747]: add 'clevis' to group 'tss'
Oct 10 09:59:02 compute-2 usermod[166747]: add 'clevis' to shadow group 'tss'
Oct 10 09:59:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:59:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:02 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:02 compute-2 sudo[166758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 09:59:02 compute-2 sudo[166758]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:59:02 compute-2 sudo[166758]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:02.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:03.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:59:03 compute-2 ceph-mon[74913]: pgmap v385: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 09:59:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:03 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:04 compute-2 polkitd[7343]: Reloading rules
Oct 10 09:59:04 compute-2 polkitd[7343]: Collecting garbage unconditionally...
Oct 10 09:59:04 compute-2 polkitd[7343]: Loading rules from directory /etc/polkit-1/rules.d
Oct 10 09:59:04 compute-2 polkitd[7343]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 10 09:59:04 compute-2 polkitd[7343]: Finished loading, compiling and executing 4 rules
Oct 10 09:59:04 compute-2 polkitd[7343]: Reloading rules
Oct 10 09:59:04 compute-2 polkitd[7343]: Collecting garbage unconditionally...
Oct 10 09:59:04 compute-2 polkitd[7343]: Loading rules from directory /etc/polkit-1/rules.d
Oct 10 09:59:04 compute-2 polkitd[7343]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 10 09:59:04 compute-2 polkitd[7343]: Finished loading, compiling and executing 4 rules
Oct 10 09:59:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:04 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:04.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:05 compute-2 podman[166870]: 2025-10-10 09:59:05.046438183 +0000 UTC m=+0.080493058 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 10 09:59:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:05.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:05 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:05 compute-2 ceph-mon[74913]: pgmap v386: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:06 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:06 compute-2 groupadd[166990]: group added to /etc/group: name=ceph, GID=167
Oct 10 09:59:06 compute-2 groupadd[166990]: group added to /etc/gshadow: name=ceph
Oct 10 09:59:06 compute-2 groupadd[166990]: new group: name=ceph, GID=167
Oct 10 09:59:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:06.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:07 compute-2 useradd[166997]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Oct 10 09:59:07 compute-2 ceph-mon[74913]: pgmap v387: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:07.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:07 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:08 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:59:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:08.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:59:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:09.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:09 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:09 compute-2 ceph-mon[74913]: pgmap v388: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:59:10 compute-2 systemd[1]: Stopping OpenSSH server daemon...
Oct 10 09:59:10 compute-2 sshd[1002]: Received signal 15; terminating.
Oct 10 09:59:10 compute-2 systemd[1]: sshd.service: Deactivated successfully.
Oct 10 09:59:10 compute-2 systemd[1]: Stopped OpenSSH server daemon.
Oct 10 09:59:10 compute-2 systemd[1]: sshd.service: Consumed 2.319s CPU time, read 0B from disk, written 4.0K to disk.
Oct 10 09:59:10 compute-2 systemd[1]: Stopped target sshd-keygen.target.
Oct 10 09:59:10 compute-2 systemd[1]: Stopping sshd-keygen.target...
Oct 10 09:59:10 compute-2 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 09:59:10 compute-2 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 09:59:10 compute-2 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 09:59:10 compute-2 systemd[1]: Reached target sshd-keygen.target.
Oct 10 09:59:10 compute-2 systemd[1]: Starting OpenSSH server daemon...
Oct 10 09:59:10 compute-2 sshd[167664]: Server listening on 0.0.0.0 port 22.
Oct 10 09:59:10 compute-2 sshd[167664]: Server listening on :: port 22.
Oct 10 09:59:10 compute-2 systemd[1]: Started OpenSSH server daemon.
Oct 10 09:59:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:10 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:10.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:11.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:11 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:11 compute-2 ceph-mon[74913]: pgmap v389: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:11 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 09:59:11 compute-2 systemd[1]: Starting man-db-cache-update.service...
Oct 10 09:59:12 compute-2 systemd[1]: Reloading.
Oct 10 09:59:12 compute-2 systemd-sysv-generator[167927]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:59:12 compute-2 systemd-rc-local-generator[167924]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:59:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:12 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 09:59:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:12 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:12.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:13.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:13 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:13 compute-2 sudo[169492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:59:13 compute-2 sudo[169492]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:59:13 compute-2 sudo[169492]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:13 compute-2 podman[169588]: 2025-10-10 09:59:13.688821789 +0000 UTC m=+0.079842277 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 10 09:59:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:14 compute-2 ceph-mon[74913]: pgmap v390: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:14 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:14.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:15 compute-2 ceph-mon[74913]: pgmap v391: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:15.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:15 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:59:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:16 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:16 compute-2 systemd[1]: Starting PackageKit Daemon...
Oct 10 09:59:16 compute-2 PackageKit[172997]: daemon start
Oct 10 09:59:16 compute-2 systemd[1]: Started PackageKit Daemon.
Oct 10 09:59:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:16.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:17 compute-2 sudo[148632]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:17.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:17 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:17 compute-2 ceph-mon[74913]: pgmap v392: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:18 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:18.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:19.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:19 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:19 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 09:59:19 compute-2 systemd[1]: Finished man-db-cache-update.service.
Oct 10 09:59:19 compute-2 systemd[1]: man-db-cache-update.service: Consumed 9.823s CPU time.
Oct 10 09:59:19 compute-2 systemd[1]: run-r20f4e3d58f8d41ba9f126ba5b28f19ed.service: Deactivated successfully.
Oct 10 09:59:19 compute-2 ceph-mon[74913]: pgmap v393: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:59:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:20 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:20.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:21.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:21 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:22 compute-2 ceph-mon[74913]: pgmap v394: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:22 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:22.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:23 compute-2 ceph-mon[74913]: pgmap v395: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:23 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:23.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:24 compute-2 sudo[176377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgboijtugkclbwgocqmsnrpisrihnwpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090363.6246703-971-77481124047382/AnsiballZ_systemd.py'
Oct 10 09:59:24 compute-2 sudo[176377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:24 compute-2 python3.9[176379]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 09:59:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:24 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:24 compute-2 systemd[1]: Reloading.
Oct 10 09:59:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:24 compute-2 systemd-rc-local-generator[176408]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:59:24 compute-2 systemd-sysv-generator[176412]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:59:24 compute-2 sudo[176377]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:24.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:25 compute-2 sudo[176568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkbhdwvkcywxwoniqxdmtjkvfslfowco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090365.023042-971-80329595232455/AnsiballZ_systemd.py'
Oct 10 09:59:25 compute-2 sudo[176568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:25 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:25.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:25 compute-2 ceph-mon[74913]: pgmap v396: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:25 compute-2 python3.9[176570]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 09:59:25 compute-2 systemd[1]: Reloading.
Oct 10 09:59:25 compute-2 systemd-sysv-generator[176604]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:59:25 compute-2 systemd-rc-local-generator[176600]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:59:26 compute-2 sudo[176568]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:26 compute-2 sudo[176758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyufdyctxzpsldxskxhlcxjvrdeijaxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090366.2726245-971-18714913565640/AnsiballZ_systemd.py'
Oct 10 09:59:26 compute-2 sudo[176758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:26 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:26 compute-2 python3.9[176760]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 09:59:26 compute-2 systemd[1]: Reloading.
Oct 10 09:59:26 compute-2 systemd-rc-local-generator[176789]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:59:26 compute-2 systemd-sysv-generator[176792]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:59:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:26.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:27 compute-2 sudo[176758]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:27 compute-2 sudo[176950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivglqwtykjqkejdijtdpsucpuhwibiqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090367.2569504-971-127954508918408/AnsiballZ_systemd.py'
Oct 10 09:59:27 compute-2 sudo[176950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:27 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:59:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:27.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:59:27 compute-2 ceph-mon[74913]: pgmap v397: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:27 compute-2 python3.9[176952]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 09:59:27 compute-2 systemd[1]: Reloading.
Oct 10 09:59:27 compute-2 systemd-rc-local-generator[176983]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:59:27 compute-2 systemd-sysv-generator[176987]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:59:28 compute-2 sudo[176950]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:28 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:28.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:29 compute-2 sudo[177142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfclzaphcproibnezxkzwklfpeudpjop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090368.8851075-1058-57081334000651/AnsiballZ_systemd.py'
Oct 10 09:59:29 compute-2 sudo[177142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:29 compute-2 python3.9[177144]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:29 compute-2 systemd[1]: Reloading.
Oct 10 09:59:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:29 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:29.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:29 compute-2 systemd-rc-local-generator[177174]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:59:29 compute-2 systemd-sysv-generator[177179]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:59:29 compute-2 ceph-mon[74913]: pgmap v398: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:59:29 compute-2 sudo[177142]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:30 compute-2 sudo[177332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjmztffmbzasqdtspjlwmtmuaokkzhjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090369.9708838-1058-57225376112415/AnsiballZ_systemd.py'
Oct 10 09:59:30 compute-2 sudo[177332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:30 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:30 compute-2 python3.9[177334]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:30 compute-2 systemd[1]: Reloading.
Oct 10 09:59:30 compute-2 systemd-rc-local-generator[177366]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:59:30 compute-2 systemd-sysv-generator[177370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:59:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:30.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:31 compute-2 sudo[177332]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:31 compute-2 sudo[177524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nphjbddxibbbwlzecbwdzkdknlhiccnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090371.1675272-1058-68047019504927/AnsiballZ_systemd.py'
Oct 10 09:59:31 compute-2 sudo[177524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:31 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:31.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:31 compute-2 python3.9[177526]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:31 compute-2 ceph-mon[74913]: pgmap v399: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:59:31 compute-2 systemd[1]: Reloading.
Oct 10 09:59:31 compute-2 systemd-rc-local-generator[177555]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:59:31 compute-2 systemd-sysv-generator[177560]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:59:32 compute-2 sudo[177524]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:32 compute-2 sudo[177714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hywxsusnmfuoxnyucdfjabzdplouazod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090372.2158415-1058-122178472475328/AnsiballZ_systemd.py'
Oct 10 09:59:32 compute-2 sudo[177714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:32 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:32 compute-2 python3.9[177717]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:32 compute-2 sudo[177714]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:32.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:33 compute-2 sudo[177871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfferhfechlhrbxeiyhzbnjkvollfjrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090373.0917516-1058-198349528546071/AnsiballZ_systemd.py'
Oct 10 09:59:33 compute-2 sudo[177871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:33 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:33.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:33 compute-2 sudo[177874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:59:33 compute-2 sudo[177874]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:59:33 compute-2 sudo[177874]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:33 compute-2 ceph-mon[74913]: pgmap v400: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:33 compute-2 python3.9[177873]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:33 compute-2 systemd[1]: Reloading.
Oct 10 09:59:34 compute-2 systemd-sysv-generator[177930]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:59:34 compute-2 systemd-rc-local-generator[177926]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:59:34 compute-2 sudo[177871]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:34 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:34.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:35 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:35.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:35 compute-2 ceph-mon[74913]: pgmap v401: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:35 compute-2 podman[177963]: 2025-10-10 09:59:35.884153317 +0000 UTC m=+0.144655024 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct 10 09:59:36 compute-2 sudo[178114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipqmkkbhtnqufkwutbbccdvczjdwtbxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090375.9623444-1165-143385399917090/AnsiballZ_systemd.py'
Oct 10 09:59:36 compute-2 sudo[178114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:36 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:36 compute-2 python3.9[178116]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 09:59:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:36 compute-2 systemd[1]: Reloading.
Oct 10 09:59:36 compute-2 systemd-rc-local-generator[178148]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 09:59:36 compute-2 systemd-sysv-generator[178151]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 09:59:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:36.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:37 compute-2 systemd[1]: Listening on libvirt proxy daemon socket.
Oct 10 09:59:37 compute-2 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct 10 09:59:37 compute-2 sudo[178114]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:37 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:37.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:37 compute-2 ceph-mon[74913]: pgmap v402: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:38 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:38 compute-2 sudo[178312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdnimpjszxxlvepntbmeejcfzetzmlmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090378.4033067-1189-130946767116843/AnsiballZ_systemd.py'
Oct 10 09:59:38 compute-2 sudo[178312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:38.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:39 compute-2 python3.9[178314]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:39 compute-2 sudo[178312]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:39 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:39.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:39 compute-2 sudo[178468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhvvuaprtfrdaxqwmkmxlbpdackmccow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090379.313897-1189-88316971338991/AnsiballZ_systemd.py'
Oct 10 09:59:39 compute-2 sudo[178468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:39 compute-2 ceph-mon[74913]: pgmap v403: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:59:39 compute-2 python3.9[178470]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:39 compute-2 sudo[178468]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:40 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:40 compute-2 sudo[178624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltsfrzhpnafyctjnjkyzpfvxspbqdisb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090380.0722668-1189-69355453769198/AnsiballZ_systemd.py'
Oct 10 09:59:40 compute-2 sudo[178624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:40 compute-2 python3.9[178626]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:40 compute-2 sudo[178624]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:40.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:41 compute-2 sudo[178780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urenbqtilfztaxhilcctaznomrplbiap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090381.110761-1189-43949941883969/AnsiballZ_systemd.py'
Oct 10 09:59:41 compute-2 sudo[178780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:59:41.447 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 09:59:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:59:41.447 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 09:59:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 09:59:41.447 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 09:59:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:41 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:59:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:41.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:59:41 compute-2 python3.9[178782]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:41 compute-2 sudo[178780]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:41 compute-2 ceph-mon[74913]: pgmap v404: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:42 compute-2 sudo[178935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuiqttxnjzjbxutybqlqxqntxpzaladv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090381.8863814-1189-230589345100690/AnsiballZ_systemd.py'
Oct 10 09:59:42 compute-2 sudo[178935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:42 compute-2 python3.9[178937]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:42 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:42 compute-2 sudo[178935]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:42.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:43 compute-2 sudo[179092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qftiswpgfweqhrzivfvmdzghyifusgeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090382.7258682-1189-82019268643682/AnsiballZ_systemd.py'
Oct 10 09:59:43 compute-2 sudo[179092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:43 compute-2 python3.9[179094]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:43 compute-2 sudo[179092]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:43 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:43.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:43 compute-2 podman[179198]: 2025-10-10 09:59:43.792012188 +0000 UTC m=+0.070701975 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 10 09:59:43 compute-2 sudo[179266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-empbpjyfymhfgeidbodflbewxquhyhvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090383.512984-1189-220823251929792/AnsiballZ_systemd.py'
Oct 10 09:59:43 compute-2 sudo[179266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:43 compute-2 ceph-mon[74913]: pgmap v405: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:44 compute-2 python3.9[179268]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:44 compute-2 sudo[179266]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:44 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:44 compute-2 sudo[179423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qawpabxjkaqxapebnosxkrncjbyipefu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090384.3991902-1189-216859929709795/AnsiballZ_systemd.py'
Oct 10 09:59:44 compute-2 sudo[179423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:45.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:45 compute-2 python3.9[179425]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:45 compute-2 sudo[179423]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:45 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:45.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:45 compute-2 sudo[179578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcqgwejhawtnxqjwerwonwjcdtlsctzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090385.4264064-1189-271816384949702/AnsiballZ_systemd.py'
Oct 10 09:59:45 compute-2 sudo[179578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:45 compute-2 ceph-mon[74913]: pgmap v406: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:45 compute-2 python3.9[179580]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:46 compute-2 sudo[179578]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:46 compute-2 sudo[179733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btnwfzcktwgbuwemlfhvwodjqkizkgcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090386.1902177-1189-91514736054452/AnsiballZ_systemd.py'
Oct 10 09:59:46 compute-2 sudo[179733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:46 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:46 compute-2 python3.9[179735]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:46 compute-2 sudo[179733]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 09:59:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:47.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:47 compute-2 sudo[179890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evtwvfsjmqpcnwagibnfwetbdvpzzlje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090386.9338825-1189-198573712480529/AnsiballZ_systemd.py'
Oct 10 09:59:47 compute-2 sudo[179890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:47 compute-2 python3.9[179892]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:47 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:47.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:47 compute-2 sudo[179890]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:47 compute-2 ceph-mon[74913]: pgmap v407: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:47 compute-2 sudo[180045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blkwplltmwaohsbvgypsgovslfdozaog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090387.7038164-1189-225282490296249/AnsiballZ_systemd.py'
Oct 10 09:59:47 compute-2 sudo[180045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:48 compute-2 python3.9[180047]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:48 compute-2 sudo[180045]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:48 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:48 compute-2 sudo[180201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lidnxjpjmbzrclnpjbwujyjksikcadbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090388.4851012-1189-1329078359959/AnsiballZ_systemd.py'
Oct 10 09:59:48 compute-2 sudo[180201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:59:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:49.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:59:49 compute-2 python3.9[180204]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:49 compute-2 sudo[180201]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:49 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:49.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:49 compute-2 sudo[180357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjbnqqkfagrfkfrkfoifbrvgvgwhodjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090389.3248432-1189-101100121261398/AnsiballZ_systemd.py'
Oct 10 09:59:49 compute-2 sudo[180357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:49 compute-2 python3.9[180359]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 09:59:49 compute-2 ceph-mon[74913]: pgmap v408: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 09:59:50 compute-2 sudo[180357]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:50 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:51.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:51 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 09:59:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:51.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 09:59:51 compute-2 ceph-mon[74913]: pgmap v409: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:52 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:52 compute-2 sudo[180515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kndtjehqdbkivddqzjxrfndplkdyvcfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090392.510787-1496-146063922274253/AnsiballZ_file.py'
Oct 10 09:59:52 compute-2 sudo[180515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:52 compute-2 python3.9[180518]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:59:52 compute-2 sudo[180515]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:53.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:53 compute-2 sudo[180668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvtygqvfhgcgvhlrqzrnrdszjsntygou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090393.144077-1496-221702677085835/AnsiballZ_file.py'
Oct 10 09:59:53 compute-2 sudo[180668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:53 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:53.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:53 compute-2 python3.9[180670]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:59:53 compute-2 sudo[180668]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:53 compute-2 sudo[180695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 09:59:53 compute-2 sudo[180695]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 09:59:53 compute-2 sudo[180695]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:54 compute-2 ceph-mon[74913]: pgmap v410: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:54 compute-2 sudo[180845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwhwngooeuqrzczmgapveygkhdffahnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090393.7929199-1496-249916597785150/AnsiballZ_file.py'
Oct 10 09:59:54 compute-2 sudo[180845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:54 compute-2 python3.9[180847]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:59:54 compute-2 sudo[180845]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:54 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:54 compute-2 sudo[180998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upvhgrmgvtdtwidkrvlasslwnessftsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090394.4559367-1496-265212717705332/AnsiballZ_file.py'
Oct 10 09:59:54 compute-2 sudo[180998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:54 compute-2 python3.9[181000]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:59:54 compute-2 sudo[180998]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:55.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 09:59:55 compute-2 sudo[181151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tucmimxxkvvgunzfgfatekzmtmtvzcfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090395.0823276-1496-258190975352613/AnsiballZ_file.py'
Oct 10 09:59:55 compute-2 sudo[181151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:55 compute-2 python3.9[181153]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:59:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:55 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:55 compute-2 sudo[181151]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 09:59:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:55.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 09:59:55 compute-2 sudo[181303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buoicuegojpbxkumucqzggrqzwhuvjvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090395.678974-1496-262770718099012/AnsiballZ_file.py'
Oct 10 09:59:55 compute-2 sudo[181303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:56 compute-2 ceph-mon[74913]: pgmap v411: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:56 compute-2 python3.9[181305]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 09:59:56 compute-2 sudo[181303]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:56 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:57.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:57 compute-2 sudo[181457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llmnsryazwxashijljdfgolxfmcnopqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090396.7745173-1625-79440985962375/AnsiballZ_stat.py'
Oct 10 09:59:57 compute-2 sudo[181457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:57 compute-2 python3.9[181459]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:59:57 compute-2 sudo[181457]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:57 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:57.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:57 compute-2 sudo[181582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuwenzrzdkemhdfnbsibhedtkhghjlhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090396.7745173-1625-79440985962375/AnsiballZ_copy.py'
Oct 10 09:59:57 compute-2 sudo[181582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:58 compute-2 ceph-mon[74913]: pgmap v412: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 09:59:58 compute-2 python3.9[181584]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090396.7745173-1625-79440985962375/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:59:58 compute-2 sudo[181582]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e80013d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:58 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:58 compute-2 sudo[181735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niyekwtjoxcqpwsslokunwhromareqaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090398.3343463-1625-5944362674543/AnsiballZ_stat.py'
Oct 10 09:59:58 compute-2 sudo[181735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 09:59:58 compute-2 python3.9[181737]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 09:59:58 compute-2 sudo[181735]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:59.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:59 compute-2 sudo[181861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppbuwtohcsoqcbyfpjamotqcabchurzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090398.3343463-1625-5944362674543/AnsiballZ_copy.py'
Oct 10 09:59:59 compute-2 sudo[181861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:59 compute-2 python3.9[181863]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090398.3343463-1625-5944362674543/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 09:59:59 compute-2 sudo[181861]: pam_unix(sudo:session): session closed for user root
Oct 10 09:59:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:59 2025: (VI_0) received an invalid passwd!
Oct 10 09:59:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 09:59:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 09:59:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:59.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 09:59:59 compute-2 sudo[182013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxpxvzuxqzpryjchtaomigghqjjecwwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090399.5006964-1625-279787395054929/AnsiballZ_stat.py'
Oct 10 09:59:59 compute-2 sudo[182013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 09:59:59 compute-2 python3.9[182015]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:00 compute-2 sudo[182013]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:00 compute-2 ceph-mon[74913]: pgmap v413: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:00:00 compute-2 ceph-mon[74913]: overall HEALTH_OK
Oct 10 10:00:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:00 compute-2 sudo[182138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpskvipmhkrdivmlfeawtzvzczijxjku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090399.5006964-1625-279787395054929/AnsiballZ_copy.py'
Oct 10 10:00:00 compute-2 sudo[182138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:00 compute-2 python3.9[182140]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090399.5006964-1625-279787395054929/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:00 compute-2 sudo[182138]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:00 compute-2 sudo[182292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvxxzkdvragfbrgdqovpiqjcldoylwsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090400.7160127-1625-11903019279587/AnsiballZ_stat.py'
Oct 10 10:00:00 compute-2 sudo[182292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:01.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:01 compute-2 ceph-mon[74913]: pgmap v414: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:00:01 compute-2 python3.9[182294]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:01 compute-2 sudo[182292]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:01 compute-2 sudo[182417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkwzijqjxshmdxmuxwycieugxeilkvtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090400.7160127-1625-11903019279587/AnsiballZ_copy.py'
Oct 10 10:00:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:01 compute-2 sudo[182417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:00:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:01.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:00:01 compute-2 python3.9[182419]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090400.7160127-1625-11903019279587/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:01 compute-2 sudo[182417]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:00:02 compute-2 sudo[182569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfwfzzcafadnuqdfiwhevaqqwzzwixii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090401.8734188-1625-51238216520086/AnsiballZ_stat.py'
Oct 10 10:00:02 compute-2 sudo[182569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:02 compute-2 python3.9[182571]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:02 compute-2 sudo[182569]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:02 compute-2 sudo[182695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsldcfjlphucotbxhhrjanqdrdqbaxtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090401.8734188-1625-51238216520086/AnsiballZ_copy.py'
Oct 10 10:00:02 compute-2 sudo[182695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:02 compute-2 sudo[182699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:00:02 compute-2 sudo[182699]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:00:02 compute-2 sudo[182699]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:02 compute-2 python3.9[182697]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090401.8734188-1625-51238216520086/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:02 compute-2 sudo[182695]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:02 compute-2 sudo[182724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:00:02 compute-2 sudo[182724]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:00:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:03.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:03 compute-2 ceph-mon[74913]: pgmap v415: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:00:03 compute-2 sudo[182916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfveapdgitbglqcfmkcagdqifholgdgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090403.088456-1625-140997741223943/AnsiballZ_stat.py'
Oct 10 10:00:03 compute-2 sudo[182916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:03 compute-2 sudo[182724]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:03 compute-2 python3.9[182920]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:03 compute-2 sudo[182916]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:03.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:03 compute-2 sudo[183055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwzphupmgnjrourdjxntmerhzdbjlmbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090403.088456-1625-140997741223943/AnsiballZ_copy.py'
Oct 10 10:00:03 compute-2 sudo[183055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:04 compute-2 python3.9[183057]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090403.088456-1625-140997741223943/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:04 compute-2 sudo[183055]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:04 compute-2 sudo[183208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdnosdnzyudwcjijvmpmaijuooygjjvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090404.2670188-1625-245643392618880/AnsiballZ_stat.py'
Oct 10 10:00:04 compute-2 sudo[183208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:04 compute-2 python3.9[183210]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:04 compute-2 sudo[183208]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:05.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:05 compute-2 sudo[183332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooqcurvshmdphawlyojaybjxdgbbloxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090404.2670188-1625-245643392618880/AnsiballZ_copy.py'
Oct 10 10:00:05 compute-2 sudo[183332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:05 compute-2 python3.9[183334]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090404.2670188-1625-245643392618880/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:05 compute-2 sudo[183332]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:05.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:05 compute-2 ceph-mon[74913]: pgmap v416: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:00:05 compute-2 sudo[183484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgciumxepglrpevfmwoalckmllitruda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090405.450699-1625-62421646790118/AnsiballZ_stat.py'
Oct 10 10:00:05 compute-2 sudo[183484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:05 compute-2 python3.9[183486]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:05 compute-2 sudo[183484]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:06 compute-2 sudo[183624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eotywmsnuqlhnewhgeaclwqjdtzixuql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090405.450699-1625-62421646790118/AnsiballZ_copy.py'
Oct 10 10:00:06 compute-2 sudo[183624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:06 compute-2 podman[183583]: 2025-10-10 10:00:06.248266746 +0000 UTC m=+0.085137217 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct 10 10:00:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:06 compute-2 python3.9[183630]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090405.450699-1625-62421646790118/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:06 compute-2 sudo[183624]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:07.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:00:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:00:07 compute-2 ceph-mon[74913]: pgmap v417: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:00:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:00:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:00:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:00:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:00:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:00:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:00:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:00:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:07.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4004430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:08 compute-2 sudo[183789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkexlxpxhygooicgzydafbwkpribajwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090408.260992-1964-193214725176811/AnsiballZ_command.py'
Oct 10 10:00:08 compute-2 sudo[183789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:08 compute-2 python3.9[183792]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct 10 10:00:08 compute-2 sudo[183789]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:09.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:09 compute-2 sudo[183944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wumspzicttxwiydjdbwlfbiieihhbgey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090409.1842809-1990-240426619226745/AnsiballZ_file.py'
Oct 10 10:00:09 compute-2 sudo[183944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:09.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:09 compute-2 ceph-mon[74913]: pgmap v418: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:00:09 compute-2 python3.9[183946]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:09 compute-2 sudo[183944]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:10 compute-2 sudo[184096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvfszlnyvhkcdamsfxtpcaojltypcyoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090409.769024-1990-102858482488402/AnsiballZ_file.py'
Oct 10 10:00:10 compute-2 sudo[184096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:10 compute-2 python3.9[184098]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:10 compute-2 sudo[184096]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:10 compute-2 sudo[184251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ombjaeqyuvdfajwzwzszkxbzmwnkkmos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090410.3818238-1990-163641120559596/AnsiballZ_file.py'
Oct 10 10:00:10 compute-2 sudo[184251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:10 compute-2 python3.9[184253]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:10 compute-2 sudo[184251]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:11.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:11 compute-2 sudo[184404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgqdvclinmlhbaeqbzahdvqnrqzhgnpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090411.0013196-1990-174724464910505/AnsiballZ_file.py'
Oct 10 10:00:11 compute-2 sudo[184404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:11 compute-2 python3.9[184406]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:11 compute-2 sudo[184404]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:11.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:11 compute-2 ceph-mon[74913]: pgmap v419: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:00:11 compute-2 sudo[184556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djlpcltnwlglmslwrsmfjlkqqcriuayh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090411.606472-1990-204028835894539/AnsiballZ_file.py'
Oct 10 10:00:11 compute-2 sudo[184556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:12 compute-2 python3.9[184558]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:12 compute-2 sudo[184556]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:12 compute-2 sudo[184708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtnwdxukktzxwpeclircyjebrpnpdana ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090412.1690712-1990-273149024645682/AnsiballZ_file.py'
Oct 10 10:00:12 compute-2 sudo[184708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:12 compute-2 python3.9[184710]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:12 compute-2 sudo[184708]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:12 compute-2 sudo[184730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:00:12 compute-2 sudo[184730]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:00:12 compute-2 sudo[184730]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:12 compute-2 sudo[184887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvajapgyljpwzlyklfkrceuuwwxugarg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090412.7486124-1990-201482863751532/AnsiballZ_file.py'
Oct 10 10:00:12 compute-2 sudo[184887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:13.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:13 compute-2 python3.9[184889]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:13 compute-2 sudo[184887]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:00:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:00:13 compute-2 ceph-mon[74913]: pgmap v420: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:00:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:13 compute-2 sudo[185039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arkjynlnpklgpgruafaizwgucjxgvmpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090413.3270764-1990-86591985663340/AnsiballZ_file.py'
Oct 10 10:00:13 compute-2 sudo[185039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:13.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:13 compute-2 python3.9[185041]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:13 compute-2 sudo[185039]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:13 compute-2 sudo[185043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:00:13 compute-2 sudo[185043]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:00:13 compute-2 sudo[185043]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:13 compute-2 podman[185090]: 2025-10-10 10:00:13.949542119 +0000 UTC m=+0.062461454 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 10:00:14 compute-2 sudo[185233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjqlxsjemndtygijqsejnjkbsfcdfxtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090413.896106-1990-192982934933917/AnsiballZ_file.py'
Oct 10 10:00:14 compute-2 sudo[185233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:14 compute-2 python3.9[185235]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908002ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:14 compute-2 sudo[185233]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:14 compute-2 kernel: ganesha.nfsd[165476]: segfault at 50 ip 00007fb9bb29732e sp 00007fb9897f9210 error 4 in libntirpc.so.5.8[7fb9bb27c000+2c000] likely on CPU 0 (core 0, socket 0)
Oct 10 10:00:14 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 10:00:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003f90 fd 38 proxy ignored for local
Oct 10 10:00:14 compute-2 systemd[1]: Started Process Core Dump (PID 185357/UID 0).
Oct 10 10:00:14 compute-2 sudo[185388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrlqqwpsadojvieypbjjptjgamtasijj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090414.4886575-1990-201167336962523/AnsiballZ_file.py'
Oct 10 10:00:14 compute-2 sudo[185388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:14 compute-2 python3.9[185390]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:15 compute-2 sudo[185388]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:15.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:15 compute-2 sudo[185541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ataysfgdzwnmmbsplxewtripdnktpqkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090415.168434-1990-218792398916055/AnsiballZ_file.py'
Oct 10 10:00:15 compute-2 sudo[185541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:15 compute-2 ceph-mon[74913]: pgmap v421: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:00:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:15.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:15 compute-2 python3.9[185543]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:15 compute-2 sudo[185541]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:15 compute-2 systemd-coredump[185361]: Process 148191 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 59:
                                                    #0  0x00007fb9bb29732e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Oct 10 10:00:15 compute-2 systemd[1]: systemd-coredump@5-185357-0.service: Deactivated successfully.
Oct 10 10:00:15 compute-2 systemd[1]: systemd-coredump@5-185357-0.service: Consumed 1.149s CPU time.
Oct 10 10:00:15 compute-2 podman[185608]: 2025-10-10 10:00:15.94643285 +0000 UTC m=+0.024743380 container died d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:00:15 compute-2 systemd[1]: var-lib-containers-storage-overlay-03a086b1f9a52c382b0bf0c9603711827ef5e521aa04ce6dd516e78cd0a1e7bd-merged.mount: Deactivated successfully.
Oct 10 10:00:15 compute-2 podman[185608]: 2025-10-10 10:00:15.982558162 +0000 UTC m=+0.060868692 container remove d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True)
Oct 10 10:00:15 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 10:00:16 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 10:00:16 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.661s CPU time.
Oct 10 10:00:16 compute-2 sudo[185740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iseuszdxoiisfnvhnuncdppoprnzmwjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090415.87808-1990-191069129092843/AnsiballZ_file.py'
Oct 10 10:00:16 compute-2 sudo[185740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:16 compute-2 python3.9[185742]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:16 compute-2 sudo[185740]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:00:16 compute-2 sudo[185894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bptrbjcyuyjiinkdjnnndthexmlnbtpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090416.610918-1990-187837850375297/AnsiballZ_file.py'
Oct 10 10:00:16 compute-2 sudo[185894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:17 compute-2 python3.9[185896]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:17.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:17 compute-2 sudo[185894]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:17 compute-2 sudo[186046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yycspqeosmncsvetiigvwdyystyjymag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090417.1683455-1990-126374950200822/AnsiballZ_file.py'
Oct 10 10:00:17 compute-2 sudo[186046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:17 compute-2 python3.9[186048]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:17.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:17 compute-2 sudo[186046]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:17 compute-2 ceph-mon[74913]: pgmap v422: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:00:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:19.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:19.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:19 compute-2 ceph-mon[74913]: pgmap v423: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 10:00:20 compute-2 sudo[186200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpvxawfcymbcvftfgxuaoibucqsfltjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090419.8201063-2288-62071139539853/AnsiballZ_stat.py'
Oct 10 10:00:20 compute-2 sudo[186200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:20 compute-2 python3.9[186202]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:20 compute-2 sudo[186200]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:20 compute-2 sudo[186324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viiqgevvrhleqajudluypogzkxkqrlqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090419.8201063-2288-62071139539853/AnsiballZ_copy.py'
Oct 10 10:00:20 compute-2 sudo[186324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100020 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:00:20 compute-2 python3.9[186326]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090419.8201063-2288-62071139539853/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:20 compute-2 sudo[186324]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:21.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:21 compute-2 sudo[186477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdcojfwgccdrjwgwtpblvblmicfvpnvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090420.9733894-2288-54614717059651/AnsiballZ_stat.py'
Oct 10 10:00:21 compute-2 sudo[186477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:21 compute-2 python3.9[186479]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:21 compute-2 sudo[186477]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:21.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:21 compute-2 ceph-mon[74913]: pgmap v424: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:00:21 compute-2 sudo[186600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlwfvzxgcvebgdjdkagqxzpvdegunyra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090420.9733894-2288-54614717059651/AnsiballZ_copy.py'
Oct 10 10:00:21 compute-2 sudo[186600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:21 compute-2 python3.9[186602]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090420.9733894-2288-54614717059651/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:21 compute-2 sudo[186600]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:22 compute-2 sudo[186752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkfcrvenqdqdxmdfrbbtjmcvdsbquhma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090422.1163347-2288-105092995202982/AnsiballZ_stat.py'
Oct 10 10:00:22 compute-2 sudo[186752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:22 compute-2 python3.9[186754]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:22 compute-2 sudo[186752]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:22 compute-2 sudo[186877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjhzfjsucsuxdjdaqvpdaferhremxsck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090422.1163347-2288-105092995202982/AnsiballZ_copy.py'
Oct 10 10:00:22 compute-2 sudo[186877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:23.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:23 compute-2 python3.9[186879]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090422.1163347-2288-105092995202982/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:23 compute-2 sudo[186877]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:23 compute-2 sudo[187029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rthpiuwekzwxpgxlshzoxojeonuaghtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090423.3248363-2288-109096476681839/AnsiballZ_stat.py'
Oct 10 10:00:23 compute-2 sudo[187029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:23.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:23 compute-2 ceph-mon[74913]: pgmap v425: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:00:23 compute-2 python3.9[187031]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:23 compute-2 sudo[187029]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:24 compute-2 sudo[187152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auqhursafjicqkrfkghgxpdmsfxrkzmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090423.3248363-2288-109096476681839/AnsiballZ_copy.py'
Oct 10 10:00:24 compute-2 sudo[187152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:24 compute-2 python3.9[187154]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090423.3248363-2288-109096476681839/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:24 compute-2 sudo[187152]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:24 compute-2 sudo[187305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqkcznotzouitsmwlmgwgbsmuauevpwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090424.4402773-2288-46803250057139/AnsiballZ_stat.py'
Oct 10 10:00:24 compute-2 sudo[187305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:24 compute-2 python3.9[187307]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:24 compute-2 sudo[187305]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:25.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:25 compute-2 sudo[187429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amvnqezzxsshkximtijqnkkpdhgcisly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090424.4402773-2288-46803250057139/AnsiballZ_copy.py'
Oct 10 10:00:25 compute-2 sudo[187429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:25 compute-2 python3.9[187431]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090424.4402773-2288-46803250057139/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:25 compute-2 sudo[187429]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:25.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:25 compute-2 ceph-mon[74913]: pgmap v426: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:00:25 compute-2 sudo[187581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htsxbyhsidzhkkyclosmjgbdkmambkuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090425.6616993-2288-162260178341038/AnsiballZ_stat.py'
Oct 10 10:00:25 compute-2 sudo[187581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:26 compute-2 python3.9[187583]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:26 compute-2 sudo[187581]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:26 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 6.
Oct 10 10:00:26 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:00:26 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.661s CPU time.
Oct 10 10:00:26 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 10:00:26 compute-2 podman[187699]: 2025-10-10 10:00:26.402294566 +0000 UTC m=+0.036029410 container create d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 10:00:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d1140c1c832a7cbf54fd0203a6ee559ca50a4b6de6ddc3b0879e0b1307a09df/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 10:00:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d1140c1c832a7cbf54fd0203a6ee559ca50a4b6de6ddc3b0879e0b1307a09df/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 10:00:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d1140c1c832a7cbf54fd0203a6ee559ca50a4b6de6ddc3b0879e0b1307a09df/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:00:26 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d1140c1c832a7cbf54fd0203a6ee559ca50a4b6de6ddc3b0879e0b1307a09df/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:00:26 compute-2 podman[187699]: 2025-10-10 10:00:26.459935314 +0000 UTC m=+0.093670178 container init d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 10:00:26 compute-2 podman[187699]: 2025-10-10 10:00:26.465516522 +0000 UTC m=+0.099251366 container start d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid)
Oct 10 10:00:26 compute-2 bash[187699]: d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628
Oct 10 10:00:26 compute-2 podman[187699]: 2025-10-10 10:00:26.387768142 +0000 UTC m=+0.021502986 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 10:00:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 10:00:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 10:00:26 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:00:26 compute-2 sudo[187768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfqiyxdqfurjifqzhwzcmckktpndckyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090425.6616993-2288-162260178341038/AnsiballZ_copy.py'
Oct 10 10:00:26 compute-2 sudo[187768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 10:00:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 10:00:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 10:00:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 10:00:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 10:00:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:00:26 compute-2 python3.9[187771]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090425.6616993-2288-162260178341038/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:26 compute-2 sudo[187768]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:27.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:27 compute-2 sudo[187960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swfrnfumcirlywmukcrkdumnycezehec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090426.8202562-2288-84309736547384/AnsiballZ_stat.py'
Oct 10 10:00:27 compute-2 sudo[187960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:27 compute-2 python3.9[187962]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:27 compute-2 sudo[187960]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:27 compute-2 sudo[188083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pocyriprbfihhugyhwftuamuktupsbsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090426.8202562-2288-84309736547384/AnsiballZ_copy.py'
Oct 10 10:00:27 compute-2 sudo[188083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:27.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:27 compute-2 ceph-mon[74913]: pgmap v427: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:00:27 compute-2 python3.9[188085]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090426.8202562-2288-84309736547384/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:27 compute-2 sudo[188083]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:28 compute-2 sudo[188235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmhoxasfnjdydavvmtoiibuzyrlsgoaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090427.9268405-2288-154365331841330/AnsiballZ_stat.py'
Oct 10 10:00:28 compute-2 sudo[188235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:28 compute-2 python3.9[188237]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:28 compute-2 sudo[188235]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:28 compute-2 sudo[188359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meffncmzywcksuykzptzwgkuzbfukgnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090427.9268405-2288-154365331841330/AnsiballZ_copy.py'
Oct 10 10:00:28 compute-2 sudo[188359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:28 compute-2 python3.9[188361]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090427.9268405-2288-154365331841330/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:28 compute-2 sudo[188359]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:29.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:29 compute-2 sudo[188512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pboqdceqybnhseqzwzakoldqcowgeqyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090429.0868628-2288-88523671889339/AnsiballZ_stat.py'
Oct 10 10:00:29 compute-2 sudo[188512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:29 compute-2 python3.9[188514]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:29 compute-2 sudo[188512]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:00:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:29.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:00:29 compute-2 ceph-mon[74913]: pgmap v428: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:00:29 compute-2 sudo[188635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msfqfzwvyviiszkneqnuqxxaihjkfxpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090429.0868628-2288-88523671889339/AnsiballZ_copy.py'
Oct 10 10:00:29 compute-2 sudo[188635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:30 compute-2 python3.9[188637]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090429.0868628-2288-88523671889339/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:30 compute-2 sudo[188635]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:30 compute-2 sudo[188787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijtsynzvedaobpotzfbrolrwxjthavcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090430.273673-2288-89285000421732/AnsiballZ_stat.py'
Oct 10 10:00:30 compute-2 sudo[188787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:30 compute-2 python3.9[188790]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:30 compute-2 sudo[188787]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:31 compute-2 sudo[188912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tguohzppguddxjilvaarwuqvsigijloi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090430.273673-2288-89285000421732/AnsiballZ_copy.py'
Oct 10 10:00:31 compute-2 sudo[188912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:31.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:31 compute-2 python3.9[188914]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090430.273673-2288-89285000421732/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:31 compute-2 sudo[188912]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:31.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:31 compute-2 sudo[189064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnpaqygndilsquqywfadxwbfqigagyot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090431.3919141-2288-9440498015569/AnsiballZ_stat.py'
Oct 10 10:00:31 compute-2 sudo[189064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:31 compute-2 ceph-mon[74913]: pgmap v429: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:00:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:00:31 compute-2 python3.9[189066]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:31 compute-2 sudo[189064]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:32 compute-2 sudo[189187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywgqxgxzoiymehmqbucrarmmqiylncoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090431.3919141-2288-9440498015569/AnsiballZ_copy.py'
Oct 10 10:00:32 compute-2 sudo[189187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100032 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:00:32 compute-2 python3.9[189189]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090431.3919141-2288-9440498015569/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:32 compute-2 sudo[189187]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:00:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:00:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:00:32 compute-2 sudo[189340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onmyguixbhjwtabfkzuutyrbycfmftjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090432.4962604-2288-168625719786527/AnsiballZ_stat.py'
Oct 10 10:00:32 compute-2 sudo[189340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:32 compute-2 python3.9[189342]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:32 compute-2 sudo[189340]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:33.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:33 compute-2 sudo[189464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syflczuegaigynzhgcyeisghfcwdomfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090432.4962604-2288-168625719786527/AnsiballZ_copy.py'
Oct 10 10:00:33 compute-2 sudo[189464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:33 compute-2 python3.9[189466]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090432.4962604-2288-168625719786527/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:33 compute-2 sudo[189464]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:33.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:33 compute-2 ceph-mon[74913]: pgmap v430: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Oct 10 10:00:33 compute-2 sudo[189616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emuzimofatkcanoghpobjzuwqhquaoqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090433.583281-2288-161681802084065/AnsiballZ_stat.py'
Oct 10 10:00:33 compute-2 sudo[189616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:33 compute-2 sudo[189619]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:00:33 compute-2 sudo[189619]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:00:33 compute-2 sudo[189619]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:34 compute-2 python3.9[189618]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:34 compute-2 sudo[189616]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:34 compute-2 sudo[189764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sayolpjpabjmjdbrlifhqdymjoesxofb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090433.583281-2288-161681802084065/AnsiballZ_copy.py'
Oct 10 10:00:34 compute-2 sudo[189764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:34 compute-2 python3.9[189766]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090433.583281-2288-161681802084065/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:34 compute-2 sudo[189764]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:35 compute-2 sudo[189918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnlwaczhwhgvdvferdccyntiotbqftss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090434.782771-2288-92486001447859/AnsiballZ_stat.py'
Oct 10 10:00:35 compute-2 sudo[189918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:35.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:35 compute-2 python3.9[189920]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:00:35 compute-2 sudo[189918]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:35 compute-2 sudo[190041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrynhvjjhyetpdlwghmtsgcizwwstxjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090434.782771-2288-92486001447859/AnsiballZ_copy.py'
Oct 10 10:00:35 compute-2 sudo[190041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:35.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:35 compute-2 python3.9[190043]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090434.782771-2288-92486001447859/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:35 compute-2 sudo[190041]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:35 compute-2 ceph-mon[74913]: pgmap v431: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Oct 10 10:00:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:36 compute-2 podman[190069]: 2025-10-10 10:00:36.823373894 +0000 UTC m=+0.099633369 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:00:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:00:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:00:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:00:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:00:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:37.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:37.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:37 compute-2 ceph-mon[74913]: pgmap v432: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Oct 10 10:00:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:00:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:00:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:00:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:39.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:39 compute-2 python3.9[190223]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:00:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:39.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:39 compute-2 ceph-mon[74913]: pgmap v433: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 426 B/s wr, 2 op/s
Oct 10 10:00:39 compute-2 sudo[190376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgkvymtubpxvweaqtmguxzjgykbsjkiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090439.4588096-2906-91318284398355/AnsiballZ_seboolean.py'
Oct 10 10:00:39 compute-2 sudo[190376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:40 compute-2 python3.9[190378]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct 10 10:00:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:41.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:41 compute-2 sudo[190376]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:00:41.448 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:00:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:00:41.449 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:00:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:00:41.449 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:00:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:41.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:41 compute-2 sudo[190534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhfzpoxugamztpzbsztgtwyfsbmphbhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090441.5804608-2930-147902236686376/AnsiballZ_copy.py'
Oct 10 10:00:41 compute-2 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct 10 10:00:41 compute-2 sudo[190534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:41 compute-2 ceph-mon[74913]: pgmap v434: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 341 B/s wr, 1 op/s
Oct 10 10:00:41 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Oct 10 10:00:41 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:41.921276) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:00:41 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Oct 10 10:00:41 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090441921343, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4678, "num_deletes": 502, "total_data_size": 12897035, "memory_usage": 13062384, "flush_reason": "Manual Compaction"}
Oct 10 10:00:41 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Oct 10 10:00:41 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090441998685, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 8357485, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13343, "largest_seqno": 18016, "table_properties": {"data_size": 8339729, "index_size": 12010, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4677, "raw_key_size": 36450, "raw_average_key_size": 19, "raw_value_size": 8303208, "raw_average_value_size": 4480, "num_data_blocks": 525, "num_entries": 1853, "num_filter_entries": 1853, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089994, "oldest_key_time": 1760089994, "file_creation_time": 1760090441, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:00:41 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 77754 microseconds, and 15325 cpu microseconds.
Oct 10 10:00:41 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:41.999039) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 8357485 bytes OK
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:41.999150) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.000747) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.000763) EVENT_LOG_v1 {"time_micros": 1760090442000757, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.000779) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 12876639, prev total WAL file size 12876639, number of live WAL files 2.
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.003972) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(8161KB)], [27(12MB)]
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090442004031, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 21058914, "oldest_snapshot_seqno": -1}
Oct 10 10:00:42 compute-2 python3.9[190536]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:42 compute-2 sudo[190534]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5072 keys, 15514049 bytes, temperature: kUnknown
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090442151306, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15514049, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15475480, "index_size": 24763, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12741, "raw_key_size": 126878, "raw_average_key_size": 25, "raw_value_size": 15378919, "raw_average_value_size": 3032, "num_data_blocks": 1042, "num_entries": 5072, "num_filter_entries": 5072, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090442, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.151575) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15514049 bytes
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.153549) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.9 rd, 105.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(8.0, 12.1 +0.0 blob) out(14.8 +0.0 blob), read-write-amplify(4.4) write-amplify(1.9) OK, records in: 6094, records dropped: 1022 output_compression: NoCompression
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.153601) EVENT_LOG_v1 {"time_micros": 1760090442153579, "job": 14, "event": "compaction_finished", "compaction_time_micros": 147377, "compaction_time_cpu_micros": 27757, "output_level": 6, "num_output_files": 1, "total_output_size": 15514049, "num_input_records": 6094, "num_output_records": 5072, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090442155041, "job": 14, "event": "table_file_deletion", "file_number": 29}
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090442156973, "job": 14, "event": "table_file_deletion", "file_number": 27}
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.003884) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.157055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.157061) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.157062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.157064) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:00:42 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.157065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:00:42 compute-2 sudo[190687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twrjlpdrqyumqyhtugubkdbmwavbvowa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090442.2801037-2930-228583212496235/AnsiballZ_copy.py'
Oct 10 10:00:42 compute-2 sudo[190687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:42 compute-2 python3.9[190689]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:42 compute-2 sudo[190687]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:43 compute-2 sudo[190840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chlcmbqilzenqyuekliupirabltecynm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090442.8867483-2930-92133704974894/AnsiballZ_copy.py'
Oct 10 10:00:43 compute-2 sudo[190840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:43 compute-2 python3.9[190842]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:43 compute-2 sudo[190840]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:43.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:43 compute-2 sudo[190992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yovnwjkqgekzyofdymanaihixvurcnnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090443.480874-2930-98762967977847/AnsiballZ_copy.py'
Oct 10 10:00:43 compute-2 sudo[190992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:43 compute-2 python3.9[190994]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 10:00:43 compute-2 sudo[190992]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:00:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:00:43 compute-2 ceph-mon[74913]: pgmap v435: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Oct 10 10:00:44 compute-2 sudo[191169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzeawlywcwhusulzhuxkoajbnrirnoit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090444.0393872-2930-208553744801591/AnsiballZ_copy.py'
Oct 10 10:00:44 compute-2 sudo[191169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:44 compute-2 podman[191130]: 2025-10-10 10:00:44.311853291 +0000 UTC m=+0.060501751 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:00:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d28000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:44 compute-2 python3.9[191175]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:44 compute-2 sudo[191169]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:45.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:45 compute-2 sudo[191332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnndbzojopzzxbqejzvggixtwrsubzwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090445.2780838-3038-152979062599696/AnsiballZ_copy.py'
Oct 10 10:00:45 compute-2 sudo[191332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:45.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:45 compute-2 python3.9[191334]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:45 compute-2 sudo[191332]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:45 compute-2 ceph-mon[74913]: pgmap v436: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 767 B/s wr, 3 op/s
Oct 10 10:00:46 compute-2 sudo[191484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfpikyrbigqkjnoiswzdmpmraludigjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090445.9271128-3038-221681209303953/AnsiballZ_copy.py'
Oct 10 10:00:46 compute-2 sudo[191484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:46 compute-2 python3.9[191486]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:46 compute-2 sudo[191484]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100046 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:00:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:46 compute-2 sudo[191637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzxjimsremfqeovamijgrvfllkcxsuir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090446.4869246-3038-7671246512014/AnsiballZ_copy.py'
Oct 10 10:00:46 compute-2 sudo[191637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:46 compute-2 python3.9[191639]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:00:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:00:46 compute-2 sudo[191637]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:00:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:47.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:47 compute-2 sudo[191790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itjdbtjmjamzduxhwevdiciglpxcemjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090447.0724254-3038-89169293043920/AnsiballZ_copy.py'
Oct 10 10:00:47 compute-2 sudo[191790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:47 compute-2 python3.9[191792]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:47 compute-2 sudo[191790]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:47.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:47 compute-2 sudo[191942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deocxgqeheyyzdesbbwlwcixpwoqwgxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090447.6521935-3038-250602334859576/AnsiballZ_copy.py'
Oct 10 10:00:47 compute-2 sudo[191942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:47 compute-2 ceph-mon[74913]: pgmap v437: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 767 B/s wr, 3 op/s
Oct 10 10:00:48 compute-2 python3.9[191944]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:48 compute-2 sudo[191942]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:49.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:49 compute-2 sudo[192096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwjtyviqeckraxtkgqeopmakyzqjzhbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090449.0316849-3145-20633366212046/AnsiballZ_systemd.py'
Oct 10 10:00:49 compute-2 sudo[192096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:49.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:49 compute-2 python3.9[192098]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 10:00:49 compute-2 systemd[1]: Reloading.
Oct 10 10:00:49 compute-2 systemd-rc-local-generator[192124]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:00:49 compute-2 systemd-sysv-generator[192128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:00:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:49 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:00:49 compute-2 systemd[1]: Starting libvirt logging daemon socket...
Oct 10 10:00:49 compute-2 systemd[1]: Listening on libvirt logging daemon socket.
Oct 10 10:00:49 compute-2 systemd[1]: Starting libvirt logging daemon admin socket...
Oct 10 10:00:50 compute-2 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct 10 10:00:50 compute-2 systemd[1]: Starting libvirt logging daemon...
Oct 10 10:00:50 compute-2 ceph-mon[74913]: pgmap v438: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.7 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Oct 10 10:00:50 compute-2 systemd[1]: Started libvirt logging daemon.
Oct 10 10:00:50 compute-2 sudo[192096]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14001720 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:50 compute-2 sudo[192290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyuyvnrplwgjdhlxcqdnowqjohvkcazo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090450.254516-3145-173711244397871/AnsiballZ_systemd.py'
Oct 10 10:00:50 compute-2 sudo[192290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:50 compute-2 python3.9[192292]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 10:00:50 compute-2 systemd[1]: Reloading.
Oct 10 10:00:50 compute-2 systemd-rc-local-generator[192322]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:00:50 compute-2 systemd-sysv-generator[192326]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:00:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:51.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:51 compute-2 systemd[1]: Starting libvirt nodedev daemon socket...
Oct 10 10:00:51 compute-2 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct 10 10:00:51 compute-2 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct 10 10:00:51 compute-2 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct 10 10:00:51 compute-2 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct 10 10:00:51 compute-2 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct 10 10:00:51 compute-2 systemd[1]: Starting libvirt nodedev daemon...
Oct 10 10:00:51 compute-2 systemd[1]: Started libvirt nodedev daemon.
Oct 10 10:00:51 compute-2 sudo[192290]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:51 compute-2 sudo[192506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfltunzpxgxijvwkelkmfpyhctddgjbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090451.4168837-3145-209074913310828/AnsiballZ_systemd.py'
Oct 10 10:00:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:51.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:51 compute-2 sudo[192506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:51 compute-2 python3.9[192508]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 10:00:51 compute-2 systemd[1]: Reloading.
Oct 10 10:00:52 compute-2 ceph-mon[74913]: pgmap v439: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.0 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Oct 10 10:00:52 compute-2 systemd-rc-local-generator[192533]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:00:52 compute-2 systemd-sysv-generator[192537]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:00:52 compute-2 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct 10 10:00:52 compute-2 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct 10 10:00:52 compute-2 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct 10 10:00:52 compute-2 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct 10 10:00:52 compute-2 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct 10 10:00:52 compute-2 systemd[1]: Starting libvirt proxy daemon...
Oct 10 10:00:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100052 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:00:52 compute-2 systemd[1]: Started libvirt proxy daemon.
Oct 10 10:00:52 compute-2 sudo[192506]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14002240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:52 compute-2 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct 10 10:00:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:52 compute-2 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct 10 10:00:52 compute-2 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct 10 10:00:52 compute-2 sudo[192720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icjwojvsiwwclkbzilhkpataewsgpjij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090452.4918277-3145-52732845264380/AnsiballZ_systemd.py'
Oct 10 10:00:52 compute-2 sudo[192720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:53.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:53 compute-2 python3.9[192727]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 10:00:53 compute-2 systemd[1]: Reloading.
Oct 10 10:00:53 compute-2 systemd-rc-local-generator[192751]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:00:53 compute-2 systemd-sysv-generator[192756]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:00:53 compute-2 systemd[1]: Listening on libvirt locking daemon socket.
Oct 10 10:00:53 compute-2 systemd[1]: Starting libvirt QEMU daemon socket...
Oct 10 10:00:53 compute-2 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct 10 10:00:53 compute-2 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct 10 10:00:53 compute-2 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct 10 10:00:53 compute-2 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct 10 10:00:53 compute-2 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct 10 10:00:53 compute-2 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct 10 10:00:53 compute-2 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct 10 10:00:53 compute-2 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct 10 10:00:53 compute-2 systemd[1]: Starting libvirt QEMU daemon...
Oct 10 10:00:53 compute-2 systemd[1]: Started libvirt QEMU daemon.
Oct 10 10:00:53 compute-2 sudo[192720]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:53.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:53 compute-2 setroubleshoot[192545]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 41dc1415-034a-47c4-9f0f-7f67ccec6a71
Oct 10 10:00:53 compute-2 setroubleshoot[192545]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Oct 10 10:00:53 compute-2 sudo[192940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyhnsdfrehfeyqtroagxfuzowfmjclob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090453.6610944-3145-34510403536394/AnsiballZ_systemd.py'
Oct 10 10:00:53 compute-2 sudo[192940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:54 compute-2 sudo[192943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:00:54 compute-2 sudo[192943]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:00:54 compute-2 sudo[192943]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:54 compute-2 ceph-mon[74913]: pgmap v440: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Oct 10 10:00:54 compute-2 python3.9[192942]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 10:00:54 compute-2 systemd[1]: Reloading.
Oct 10 10:00:54 compute-2 systemd-sysv-generator[192999]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:00:54 compute-2 systemd-rc-local-generator[192995]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:00:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:54 compute-2 systemd[1]: Starting libvirt secret daemon socket...
Oct 10 10:00:54 compute-2 systemd[1]: Listening on libvirt secret daemon socket.
Oct 10 10:00:54 compute-2 systemd[1]: Starting libvirt secret daemon admin socket...
Oct 10 10:00:54 compute-2 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct 10 10:00:54 compute-2 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct 10 10:00:54 compute-2 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct 10 10:00:54 compute-2 systemd[1]: Starting libvirt secret daemon...
Oct 10 10:00:54 compute-2 systemd[1]: Started libvirt secret daemon.
Oct 10 10:00:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14002240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:54 compute-2 sudo[192940]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:00:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:55.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:00:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:00:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:55.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:56 compute-2 ceph-mon[74913]: pgmap v441: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 597 B/s wr, 2 op/s
Oct 10 10:00:56 compute-2 sudo[193177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixrxmqqibhodybnanxlinaqeoaxjblzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090455.805314-3257-68975820365850/AnsiballZ_file.py'
Oct 10 10:00:56 compute-2 sudo[193177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:56 compute-2 python3.9[193179]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:00:56 compute-2 sudo[193177]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:57.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:57 compute-2 sudo[193331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbwngmwhouedousgkyxylcdiqkjhyfmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090456.5807638-3282-19989399545767/AnsiballZ_find.py'
Oct 10 10:00:57 compute-2 sudo[193331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:57 compute-2 ceph-mon[74913]: pgmap v442: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 597 B/s wr, 2 op/s
Oct 10 10:00:57 compute-2 python3.9[193333]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 10:00:57 compute-2 sudo[193331]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:57.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:57 compute-2 sudo[193483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsjvxfbrtihcmddfrlqombdwwreebyhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090457.5880659-3305-148779922011174/AnsiballZ_command.py'
Oct 10 10:00:57 compute-2 sudo[193483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:00:58 compute-2 python3.9[193485]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:00:58 compute-2 sudo[193483]: pam_unix(sudo:session): session closed for user root
Oct 10 10:00:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14002240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:00:58 compute-2 python3.9[193640]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 10:00:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:59.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:00:59 compute-2 ceph-mon[74913]: pgmap v443: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 597 B/s wr, 2 op/s
Oct 10 10:00:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:00:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:00:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:59.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:00:59 compute-2 python3.9[193791]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:00 compute-2 auditd[702]: Audit daemon rotating log files
Oct 10 10:01:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc0032f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:00 compute-2 python3.9[193912]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090459.4982643-3363-174894785370782/.source.xml follow=False _original_basename=secret.xml.j2 checksum=baa25a2f67c100fe0cd0e069ccc25ef935446dd6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:01.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:01 compute-2 ceph-mon[74913]: pgmap v444: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:01:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:01:01 compute-2 sudo[194064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjyohbeckinncahrgautbydfioyinbux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090461.3383715-3407-143915715374849/AnsiballZ_command.py'
Oct 10 10:01:01 compute-2 sudo[194064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:01:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:01.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:01:01 compute-2 python3.9[194066]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 21f084a3-af34-5230-afe4-ea5cd24a55f4
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:01:01 compute-2 polkitd[7343]: Registered Authentication Agent for unix-process:194068:336545 (system bus name :1.2007 [/usr/bin/pkttyagent --process 194068 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Oct 10 10:01:01 compute-2 polkitd[7343]: Unregistered Authentication Agent for unix-process:194068:336545 (system bus name :1.2007, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Oct 10 10:01:01 compute-2 polkitd[7343]: Registered Authentication Agent for unix-process:194067:336544 (system bus name :1.2008 [/usr/bin/pkttyagent --process 194067 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Oct 10 10:01:01 compute-2 polkitd[7343]: Unregistered Authentication Agent for unix-process:194067:336544 (system bus name :1.2008, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Oct 10 10:01:01 compute-2 sudo[194064]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:01 compute-2 CROND[194080]: (root) CMD (run-parts /etc/cron.hourly)
Oct 10 10:01:02 compute-2 run-parts[194089]: (/etc/cron.hourly) starting 0anacron
Oct 10 10:01:02 compute-2 anacron[194112]: Anacron started on 2025-10-10
Oct 10 10:01:02 compute-2 anacron[194112]: Will run job `cron.daily' in 46 min.
Oct 10 10:01:02 compute-2 anacron[194112]: Will run job `cron.weekly' in 66 min.
Oct 10 10:01:02 compute-2 anacron[194112]: Will run job `cron.monthly' in 86 min.
Oct 10 10:01:02 compute-2 anacron[194112]: Jobs will be executed sequentially
Oct 10 10:01:02 compute-2 run-parts[194116]: (/etc/cron.hourly) finished 0anacron
Oct 10 10:01:02 compute-2 CROND[194079]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 10 10:01:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc0032f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:02 compute-2 python3.9[194244]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:03.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:03 compute-2 sudo[194395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edhdlwsdxqzitvhiwznprkyfcpupcomt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090463.1193514-3454-198541988668644/AnsiballZ_command.py'
Oct 10 10:01:03 compute-2 sudo[194395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:03 compute-2 sudo[194395]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:03 compute-2 ceph-mon[74913]: pgmap v445: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:01:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:03.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:03 compute-2 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct 10 10:01:03 compute-2 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct 10 10:01:04 compute-2 sudo[194548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucsjtdhxddacjuwexokdytuykakpyutv ; FSID=21f084a3-af34-5230-afe4-ea5cd24a55f4 KEY=AQAP1ehoAAAAABAAt8v7pISuvMofUPTRybMptA== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090464.0079463-3478-119433894729613/AnsiballZ_command.py'
Oct 10 10:01:04 compute-2 sudo[194548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100104 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:01:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:04 compute-2 polkitd[7343]: Registered Authentication Agent for unix-process:194551:336811 (system bus name :1.2011 [/usr/bin/pkttyagent --process 194551 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Oct 10 10:01:04 compute-2 polkitd[7343]: Unregistered Authentication Agent for unix-process:194551:336811 (system bus name :1.2011, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Oct 10 10:01:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:04 compute-2 sudo[194548]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:05.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:05.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:05 compute-2 ceph-mon[74913]: pgmap v446: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:01:06 compute-2 sudo[194708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ludxwhjjekiiylidelhwxqqfrhmzmdfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090465.9221594-3502-53923465736390/AnsiballZ_copy.py'
Oct 10 10:01:06 compute-2 sudo[194708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:06 compute-2 python3.9[194710]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:06 compute-2 sudo[194708]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:07 compute-2 sudo[194873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewmxigafmafijqelaokbzsyiibahbhtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090466.7195778-3527-240861844818289/AnsiballZ_stat.py'
Oct 10 10:01:07 compute-2 sudo[194873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:07 compute-2 podman[194836]: 2025-10-10 10:01:07.070992408 +0000 UTC m=+0.102333748 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 10 10:01:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:07.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:07 compute-2 python3.9[194881]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:07 compute-2 sudo[194873]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:07 compute-2 sudo[195011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adzgrqzzglfpjzlornozgjxsubzucdfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090466.7195778-3527-240861844818289/AnsiballZ_copy.py'
Oct 10 10:01:07 compute-2 sudo[195011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:01:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:07.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:01:07 compute-2 ceph-mon[74913]: pgmap v447: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:01:07 compute-2 python3.9[195013]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090466.7195778-3527-240861844818289/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:07 compute-2 sudo[195011]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:08 compute-2 sudo[195164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itschjgqogzbjrthmctcnbtjwdmlcgbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090468.3495054-3575-15202324157252/AnsiballZ_file.py'
Oct 10 10:01:08 compute-2 sudo[195164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:08 compute-2 python3.9[195166]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:08 compute-2 sudo[195164]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:09.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:09 compute-2 sudo[195317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqnonbyzolnrabjdaczrsbjvvqtjuyxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090469.1854098-3599-74305934945498/AnsiballZ_stat.py'
Oct 10 10:01:09 compute-2 sudo[195317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100109 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:01:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:09.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:09 compute-2 ceph-mon[74913]: pgmap v448: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:01:09 compute-2 python3.9[195319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:09 compute-2 sudo[195317]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:10 compute-2 sudo[195395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppquyfrbalmvxmlkiydydxvyfpdqsrrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090469.1854098-3599-74305934945498/AnsiballZ_file.py'
Oct 10 10:01:10 compute-2 sudo[195395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:10 compute-2 python3.9[195397]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:10 compute-2 sudo[195395]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:10 compute-2 sudo[195549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-negbrdewjcatbjpvvoemthcpqglraerz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090470.6526597-3635-99070506841841/AnsiballZ_stat.py'
Oct 10 10:01:10 compute-2 sudo[195549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:11.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:11 compute-2 python3.9[195551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:11 compute-2 sudo[195549]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:11 compute-2 sudo[195627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woseuallfpfgnyelmunyjpjegoltqfot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090470.6526597-3635-99070506841841/AnsiballZ_file.py'
Oct 10 10:01:11 compute-2 sudo[195627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:11 compute-2 python3.9[195629]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.c9eebzku recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:11 compute-2 sudo[195627]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:11.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:11 compute-2 ceph-mon[74913]: pgmap v449: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Oct 10 10:01:12 compute-2 sudo[195779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nderkyybywkkhbqhzpflozklrfoaasax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090471.8888535-3670-257172308957541/AnsiballZ_stat.py'
Oct 10 10:01:12 compute-2 sudo[195779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c003cc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:01:12 compute-2 python3.9[195781]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:12 compute-2 sudo[195779]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:12 compute-2 sudo[195858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzvstoicihiyyqcghugwlxuppcaeluab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090471.8888535-3670-257172308957541/AnsiballZ_file.py'
Oct 10 10:01:12 compute-2 sudo[195858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:12 compute-2 sudo[195862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:01:12 compute-2 sudo[195862]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:01:12 compute-2 sudo[195862]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:12 compute-2 python3.9[195860]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:12 compute-2 sudo[195887]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:01:12 compute-2 sudo[195887]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:01:12 compute-2 sudo[195858]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:13.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:13 compute-2 sudo[195887]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:13 compute-2 sudo[196092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onaergftkopyoymyqzoxvhyfieszzjvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090473.379571-3710-26497174871278/AnsiballZ_command.py'
Oct 10 10:01:13 compute-2 sudo[196092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:13.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:13 compute-2 ceph-mon[74913]: pgmap v450: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:01:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:01:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:01:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:01:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:01:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:01:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:01:13 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:01:13 compute-2 python3.9[196094]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:01:13 compute-2 sudo[196092]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:14 compute-2 sudo[196120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:01:14 compute-2 sudo[196120]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:01:14 compute-2 sudo[196120]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c003cc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:14 compute-2 sudo[196284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibljgdxzlslcfyofndbrsvkclrwxzzpa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760090474.2425318-3734-116837462300948/AnsiballZ_edpm_nftables_from_files.py'
Oct 10 10:01:14 compute-2 sudo[196284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:14 compute-2 podman[196245]: 2025-10-10 10:01:14.698893088 +0000 UTC m=+0.048922408 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 10 10:01:14 compute-2 python3[196292]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 10 10:01:14 compute-2 sudo[196284]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:15.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:15 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:01:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:15 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:01:15 compute-2 sudo[196444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzggtlpgmjlqppydxhsaexcoinrnlobx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090475.192301-3757-75239076981949/AnsiballZ_stat.py'
Oct 10 10:01:15 compute-2 sudo[196444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:15 compute-2 python3.9[196446]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:15 compute-2 sudo[196444]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:15.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:15 compute-2 ceph-mon[74913]: pgmap v451: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:01:15 compute-2 sudo[196522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veqpbmxuvsehclanhxjtzaljaavqvwjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090475.192301-3757-75239076981949/AnsiballZ_file.py'
Oct 10 10:01:15 compute-2 sudo[196522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:15 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:01:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:15 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:01:16 compute-2 python3.9[196524]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:16 compute-2 sudo[196522]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08000fa0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:01:16 compute-2 sudo[196679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxjehymfwuueqbpadvcuujrkxyzsnzme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090476.7167964-3793-4246249491304/AnsiballZ_stat.py'
Oct 10 10:01:16 compute-2 sudo[196679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:17.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:17 compute-2 python3.9[196681]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:17 compute-2 sudo[196679]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:17 compute-2 sudo[196757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxhuavaxuovrzckfcncmsnkethetqnxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090476.7167964-3793-4246249491304/AnsiballZ_file.py'
Oct 10 10:01:17 compute-2 sudo[196757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:17 compute-2 python3.9[196759]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:17 compute-2 sudo[196757]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:17.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:17 compute-2 ceph-mon[74913]: pgmap v452: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:01:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:18 compute-2 sudo[196909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neunowwfdfhxktggoscnxlwzgcmsfusr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090478.1445465-3830-152518578985330/AnsiballZ_stat.py'
Oct 10 10:01:18 compute-2 sudo[196909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:18 compute-2 python3.9[196911]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:18 compute-2 sudo[196909]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:18 compute-2 sudo[196989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srodfyuqtggcfqaiuaiatmlcrohsnmvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090478.1445465-3830-152518578985330/AnsiballZ_file.py'
Oct 10 10:01:18 compute-2 sudo[196989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:01:18 compute-2 sudo[196991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:01:18 compute-2 sudo[196991]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:01:18 compute-2 sudo[196991]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:19.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:19 compute-2 python3.9[196992]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:19 compute-2 sudo[196989]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:19.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:19 compute-2 ceph-mon[74913]: pgmap v453: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 767 B/s wr, 3 op/s
Oct 10 10:01:19 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:01:19 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:01:19 compute-2 sudo[197166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsabiubpyoxdybcawigfrfmigvkgfblf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090479.5134094-3866-50144593872174/AnsiballZ_stat.py'
Oct 10 10:01:19 compute-2 sudo[197166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:19 compute-2 python3.9[197168]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:19 compute-2 sudo[197166]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:20 compute-2 sudo[197244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxywkhggyeagfdmzyyfoaqbhhkprjkul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090479.5134094-3866-50144593872174/AnsiballZ_file.py'
Oct 10 10:01:20 compute-2 sudo[197244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:20 compute-2 python3.9[197246]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:20 compute-2 sudo[197244]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08001aa0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf80016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:21.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:21 compute-2 sudo[197398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hebqbavcjiuwggyllzeqwfncjbenvljf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090480.780365-3902-157953605388440/AnsiballZ_stat.py'
Oct 10 10:01:21 compute-2 sudo[197398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:21 compute-2 python3.9[197400]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:21 compute-2 sudo[197398]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:21 compute-2 sudo[197523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ophnmpzzvgproybfmwtfxgsfubudecrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090480.780365-3902-157953605388440/AnsiballZ_copy.py'
Oct 10 10:01:21 compute-2 sudo[197523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:21.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:21 compute-2 ceph-mon[74913]: pgmap v454: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 767 B/s wr, 3 op/s
Oct 10 10:01:21 compute-2 python3.9[197525]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090480.780365-3902-157953605388440/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:21 compute-2 sudo[197523]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:21 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:01:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08001aa0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:22 compute-2 sudo[197676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twmsprdilcnbmdvkavapngixfezwwyxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090482.3528788-3947-263192444988820/AnsiballZ_file.py'
Oct 10 10:01:22 compute-2 sudo[197676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf80016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:22 compute-2 python3.9[197678]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:22 compute-2 sudo[197676]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:01:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:23.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:01:23 compute-2 sudo[197829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyhxwsfiygecigilrghxlmcufoflcaso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090483.1539497-3971-198570928862907/AnsiballZ_command.py'
Oct 10 10:01:23 compute-2 sudo[197829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:23 compute-2 python3.9[197831]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:01:23 compute-2 sudo[197829]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:23.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:23 compute-2 ceph-mon[74913]: pgmap v455: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.7 KiB/s rd, 1.5 KiB/s wr, 5 op/s
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.788446) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483788495, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 658, "num_deletes": 252, "total_data_size": 1234293, "memory_usage": 1252024, "flush_reason": "Manual Compaction"}
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483795751, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 571764, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18021, "largest_seqno": 18674, "table_properties": {"data_size": 568864, "index_size": 872, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7595, "raw_average_key_size": 19, "raw_value_size": 562820, "raw_average_value_size": 1481, "num_data_blocks": 38, "num_entries": 380, "num_filter_entries": 380, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090442, "oldest_key_time": 1760090442, "file_creation_time": 1760090483, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 7426 microseconds, and 2627 cpu microseconds.
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.795878) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 571764 bytes OK
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.795899) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.797328) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.797343) EVENT_LOG_v1 {"time_micros": 1760090483797338, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.797361) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1230683, prev total WAL file size 1230683, number of live WAL files 2.
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.797994) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323531' seq:72057594037927935, type:22 .. '6D67727374617400353034' seq:0, type:0; will stop at (end)
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(558KB)], [30(14MB)]
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483798075, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 16085813, "oldest_snapshot_seqno": -1}
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4950 keys, 12217035 bytes, temperature: kUnknown
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483866320, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 12217035, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12183408, "index_size": 20141, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12421, "raw_key_size": 124728, "raw_average_key_size": 25, "raw_value_size": 12093018, "raw_average_value_size": 2443, "num_data_blocks": 840, "num_entries": 4950, "num_filter_entries": 4950, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090483, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.866680) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 12217035 bytes
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.868509) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 235.3 rd, 178.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 14.8 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(49.5) write-amplify(21.4) OK, records in: 5452, records dropped: 502 output_compression: NoCompression
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.868549) EVENT_LOG_v1 {"time_micros": 1760090483868535, "job": 16, "event": "compaction_finished", "compaction_time_micros": 68375, "compaction_time_cpu_micros": 25086, "output_level": 6, "num_output_files": 1, "total_output_size": 12217035, "num_input_records": 5452, "num_output_records": 4950, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483868821, "job": 16, "event": "table_file_deletion", "file_number": 32}
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483871760, "job": 16, "event": "table_file_deletion", "file_number": 30}
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.797873) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.871938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.871943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.871945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.871946) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:01:23 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.871948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:01:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100124 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:01:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:24 compute-2 sudo[197984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgfeyttodyasonejnxzkojtyknecuqga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090484.0408049-3995-46159216840797/AnsiballZ_blockinfile.py'
Oct 10 10:01:24 compute-2 sudo[197984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:24 compute-2 python3.9[197986]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:24 compute-2 sudo[197984]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d080027b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:01:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:01:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:25.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:25 compute-2 sudo[198138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnafyqdbwytrouoommhqlbrrooiyaeep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090485.102383-4022-119568443854477/AnsiballZ_command.py'
Oct 10 10:01:25 compute-2 sudo[198138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:25 compute-2 python3.9[198140]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:01:25 compute-2 sudo[198138]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:25.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:25 compute-2 ceph-mon[74913]: pgmap v456: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.3 KiB/s rd, 1.4 KiB/s wr, 4 op/s
Oct 10 10:01:26 compute-2 sudo[198291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swnsygphcmghouzqmzsuijqzetboythx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090485.9753904-4046-266726637114968/AnsiballZ_stat.py'
Oct 10 10:01:26 compute-2 sudo[198291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf80016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:26 compute-2 python3.9[198293]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:01:26 compute-2 sudo[198291]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:27 compute-2 sudo[198447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikwcyrpfcqsdnuggvefolphmievgokpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090486.801089-4070-208646139637019/AnsiballZ_command.py'
Oct 10 10:01:27 compute-2 sudo[198447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:27.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:27 compute-2 python3.9[198449]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:01:27 compute-2 sudo[198447]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:27.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:27 compute-2 ceph-mon[74913]: pgmap v457: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.3 KiB/s rd, 1.4 KiB/s wr, 4 op/s
Oct 10 10:01:27 compute-2 sudo[198602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybvbrmxzmbobliaevonksyptjpqnpgye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090487.6968415-4094-133461984534933/AnsiballZ_file.py'
Oct 10 10:01:27 compute-2 sudo[198602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:27 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:01:28 compute-2 python3.9[198604]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:28 compute-2 sudo[198602]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08002930 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:28 compute-2 sudo[198756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjpdtkrmgtnzmgzpojcqmsttyprbpikw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090488.543982-4117-275162118658175/AnsiballZ_stat.py'
Oct 10 10:01:28 compute-2 sudo[198756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:29 compute-2 python3.9[198758]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:29 compute-2 sudo[198756]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:29.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:29 compute-2 sudo[198879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kumtmggnszvxlrlyfkppgiakqdxonhbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090488.543982-4117-275162118658175/AnsiballZ_copy.py'
Oct 10 10:01:29 compute-2 sudo[198879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:29 compute-2 python3.9[198881]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090488.543982-4117-275162118658175/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:29 compute-2 sudo[198879]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:29.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:29 compute-2 ceph-mon[74913]: pgmap v458: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 4.2 KiB/s rd, 1.7 KiB/s wr, 6 op/s
Oct 10 10:01:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:30 compute-2 sudo[199031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffzixxedlppfdnphcmgsqiiophblcbkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090490.080768-4163-241776073727949/AnsiballZ_stat.py'
Oct 10 10:01:30 compute-2 sudo[199031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:30 compute-2 python3.9[199033]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:30 compute-2 sudo[199031]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08003250 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:30 compute-2 sudo[199156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqgqsczrclceucprpiesjkxpiwjizgep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090490.080768-4163-241776073727949/AnsiballZ_copy.py'
Oct 10 10:01:30 compute-2 sudo[199156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:31 compute-2 python3.9[199158]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090490.080768-4163-241776073727949/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:31 compute-2 sudo[199156]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:31.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100131 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:01:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:31.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:31 compute-2 sudo[199308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvrlvcjypdoweuuqarvojrmhedgfsbzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090491.5657253-4208-253424999067681/AnsiballZ_stat.py'
Oct 10 10:01:31 compute-2 sudo[199308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:31 compute-2 ceph-mon[74913]: pgmap v459: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:01:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:01:32 compute-2 python3.9[199310]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:01:32 compute-2 sudo[199308]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:32 compute-2 sudo[199431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecipmbidtpmnrawoubtzhjxenetbrdrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090491.5657253-4208-253424999067681/AnsiballZ_copy.py'
Oct 10 10:01:32 compute-2 sudo[199431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:32 compute-2 python3.9[199433]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090491.5657253-4208-253424999067681/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:01:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:32 compute-2 sudo[199431]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08003250 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:01:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:33.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:01:33 compute-2 sudo[199585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhfzdzeatxixbpbekqcoryjncoffyarl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090493.120267-4253-177117355658992/AnsiballZ_systemd.py'
Oct 10 10:01:33 compute-2 sudo[199585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:33.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:33 compute-2 python3.9[199587]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:01:33 compute-2 systemd[1]: Reloading.
Oct 10 10:01:33 compute-2 systemd-rc-local-generator[199606]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:01:33 compute-2 systemd-sysv-generator[199609]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:01:33 compute-2 ceph-mon[74913]: pgmap v460: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:01:34 compute-2 systemd[1]: Reached target edpm_libvirt.target.
Oct 10 10:01:34 compute-2 sudo[199585]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:34 compute-2 sudo[199651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:01:34 compute-2 sudo[199651]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:01:34 compute-2 sudo[199651]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100134 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:01:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:34 compute-2 sudo[199802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knfugvqlftexalsflrxkpbqxbmiddbtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090494.4737997-4277-87587399221656/AnsiballZ_systemd.py'
Oct 10 10:01:34 compute-2 sudo[199802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:34 compute-2 python3.9[199804]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 10 10:01:35 compute-2 systemd[1]: Reloading.
Oct 10 10:01:35 compute-2 systemd-rc-local-generator[199833]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:01:35 compute-2 systemd-sysv-generator[199836]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:01:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:35.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:35 compute-2 systemd[1]: Reloading.
Oct 10 10:01:35 compute-2 systemd-rc-local-generator[199866]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:01:35 compute-2 systemd-sysv-generator[199872]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:01:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:35.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:35 compute-2 sudo[199802]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:35 compute-2 ceph-mon[74913]: pgmap v461: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Oct 10 10:01:36 compute-2 sshd-session[141971]: Connection closed by 192.168.122.30 port 39214
Oct 10 10:01:36 compute-2 sshd-session[141967]: pam_unix(sshd:session): session closed for user zuul
Oct 10 10:01:36 compute-2 systemd[1]: session-53.scope: Deactivated successfully.
Oct 10 10:01:36 compute-2 systemd[1]: session-53.scope: Consumed 3min 20.626s CPU time.
Oct 10 10:01:36 compute-2 systemd-logind[796]: Session 53 logged out. Waiting for processes to exit.
Oct 10 10:01:36 compute-2 systemd-logind[796]: Removed session 53.
Oct 10 10:01:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08003250 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:37.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:37.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:37 compute-2 podman[199905]: 2025-10-10 10:01:37.818555468 +0000 UTC m=+0.096757100 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller)
Oct 10 10:01:37 compute-2 ceph-mon[74913]: pgmap v462: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Oct 10 10:01:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d080042f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:39.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:39.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:39 compute-2 ceph-mon[74913]: pgmap v463: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Oct 10 10:01:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:41.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:01:41.450 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:01:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:01:41.451 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:01:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:01:41.451 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:01:41 compute-2 sshd-session[199936]: Accepted publickey for zuul from 192.168.122.30 port 57240 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 10:01:41 compute-2 systemd-logind[796]: New session 54 of user zuul.
Oct 10 10:01:41 compute-2 systemd[1]: Started Session 54 of User zuul.
Oct 10 10:01:41 compute-2 sshd-session[199936]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 10:01:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:41.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:41 compute-2 ceph-mon[74913]: pgmap v464: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:01:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:42 compute-2 python3.9[200089]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 10:01:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:01:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:43.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:43.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:43 compute-2 ceph-mon[74913]: pgmap v465: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:01:43 compute-2 sudo[200245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otmrfxjdmiokcllvrkfphpumbeotckxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090503.4455159-64-78737647580736/AnsiballZ_file.py'
Oct 10 10:01:43 compute-2 sudo[200245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:44 compute-2 python3.9[200247]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:01:44 compute-2 sudo[200245]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:44 compute-2 sudo[200398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myyuknuszpnmutkkqjapylhemgwvnahg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090504.3683965-64-244943979735923/AnsiballZ_file.py'
Oct 10 10:01:44 compute-2 sudo[200398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:44 compute-2 python3.9[200400]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:01:44 compute-2 sudo[200398]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:45.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:45 compute-2 sudo[200562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dojhshsxlqacgnpwfgtjvhioieqnslpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090505.0097992-64-196246398211837/AnsiballZ_file.py'
Oct 10 10:01:45 compute-2 sudo[200562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:45 compute-2 podman[200525]: 2025-10-10 10:01:45.271889112 +0000 UTC m=+0.049501196 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 10 10:01:45 compute-2 python3.9[200570]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:01:45 compute-2 sudo[200562]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:45 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:01:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:45 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:01:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:45.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:45 compute-2 sudo[200720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whstiutayuptmjfhwesyalsgsrkboonw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090505.5911238-64-66052356355708/AnsiballZ_file.py'
Oct 10 10:01:45 compute-2 sudo[200720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:45 compute-2 ceph-mon[74913]: pgmap v466: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:01:46 compute-2 python3.9[200722]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 10 10:01:46 compute-2 sudo[200720]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:46 compute-2 sudo[200872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyrfayxavjgddynwtenilpknrgufkqrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090506.2206025-64-224087741703394/AnsiballZ_file.py'
Oct 10 10:01:46 compute-2 sudo[200872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:46 compute-2 python3.9[200874]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:01:46 compute-2 sudo[200872]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:01:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:47.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:47.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:47 compute-2 ceph-mon[74913]: pgmap v467: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:01:48 compute-2 sudo[201026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dosjyxsvopyptxaqzytysvjtnfwwfhba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090507.5699594-173-11670708726987/AnsiballZ_stat.py'
Oct 10 10:01:48 compute-2 sudo[201026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:48 compute-2 python3.9[201028]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:01:48 compute-2 sudo[201026]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:01:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:49.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:49 compute-2 sudo[201184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eumjtiewoorwplerxhsckjwlfwkaefzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090508.5131247-196-142415103401511/AnsiballZ_systemd.py'
Oct 10 10:01:49 compute-2 sudo[201184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:49 compute-2 python3.9[201186]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:01:49 compute-2 systemd[1]: Reloading.
Oct 10 10:01:49 compute-2 systemd-rc-local-generator[201215]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:01:49 compute-2 systemd-sysv-generator[201218]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:01:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:49.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:49 compute-2 sudo[201184]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:50 compute-2 ceph-mon[74913]: pgmap v468: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:01:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:50 compute-2 sudo[201373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqqgaetxbouqqxrmqgmgjidqkiurgsbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090510.1557062-220-245526086744674/AnsiballZ_service_facts.py'
Oct 10 10:01:50 compute-2 sudo[201373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:50 compute-2 python3.9[201375]: ansible-ansible.builtin.service_facts Invoked
Oct 10 10:01:50 compute-2 network[201393]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 10:01:50 compute-2 network[201394]: 'network-scripts' will be removed from distribution in near future.
Oct 10 10:01:50 compute-2 network[201395]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 10:01:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:51.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:01:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:51.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:01:52 compute-2 ceph-mon[74913]: pgmap v469: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:01:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c0013c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:53.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:53.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:53 compute-2 sudo[201373]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:54 compute-2 ceph-mon[74913]: pgmap v470: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:01:54 compute-2 sudo[201544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:01:54 compute-2 sudo[201544]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:01:54 compute-2 sudo[201544]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100154 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:01:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:55.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:01:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:55.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:56 compute-2 ceph-mon[74913]: pgmap v471: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:01:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c002230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:56 compute-2 sudo[201697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yusgdxwuxrtntrnuqbdfkkknojhuongs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090516.5151265-244-175867053747689/AnsiballZ_systemd.py'
Oct 10 10:01:56 compute-2 sudo[201697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:57 compute-2 python3.9[201699]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:01:57 compute-2 systemd[1]: Reloading.
Oct 10 10:01:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:57.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:57 compute-2 systemd-rc-local-generator[201730]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:01:57 compute-2 systemd-sysv-generator[201733]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:01:57 compute-2 sudo[201697]: pam_unix(sudo:session): session closed for user root
Oct 10 10:01:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:01:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:57.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:01:58 compute-2 ceph-mon[74913]: pgmap v472: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:01:58 compute-2 python3.9[201887]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:01:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c002230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:01:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:59.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:01:59 compute-2 sudo[202039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbjhkbrzxegraecigxlarkmlehsqrfyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090518.6423278-296-181881348467658/AnsiballZ_podman_container.py'
Oct 10 10:01:59 compute-2 sudo[202039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:01:59 compute-2 python3.9[202041]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None 
passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 10 10:01:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:01:59 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 10:01:59 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 10:01:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:01:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:01:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:59.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:00 compute-2 ceph-mon[74913]: pgmap v473: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:02:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002ea0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c002f40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:00 compute-2 podman[202053]: 2025-10-10 10:02:00.825899874 +0000 UTC m=+1.355243996 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct 10 10:02:00 compute-2 podman[202114]: 2025-10-10 10:02:00.957548844 +0000 UTC m=+0.042965288 container create 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:02:00 compute-2 NetworkManager[44866]: <info>  [1760090520.9787] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/23)
Oct 10 10:02:01 compute-2 kernel: podman0: port 1(veth0) entered blocking state
Oct 10 10:02:01 compute-2 kernel: podman0: port 1(veth0) entered disabled state
Oct 10 10:02:01 compute-2 kernel: veth0: entered allmulticast mode
Oct 10 10:02:01 compute-2 kernel: veth0: entered promiscuous mode
Oct 10 10:02:01 compute-2 kernel: podman0: port 1(veth0) entered blocking state
Oct 10 10:02:01 compute-2 kernel: podman0: port 1(veth0) entered forwarding state
Oct 10 10:02:01 compute-2 systemd-udevd[202134]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 10:02:01 compute-2 systemd-udevd[202132]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0114] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0133] device (veth0): carrier: link connected
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0137] device (podman0): carrier: link connected
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0243] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0262] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0273] device (podman0): Activation: starting connection 'podman0' (35ed1ff3-e2d7-4b9b-a59b-c3bf7706578c)
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0275] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0280] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0284] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0287] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 10 10:02:01 compute-2 podman[202114]: 2025-10-10 10:02:00.935623716 +0000 UTC m=+0.021040170 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct 10 10:02:01 compute-2 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 10:02:01 compute-2 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0534] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0537] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.0547] device (podman0): Activation: successful, device activated.
Oct 10 10:02:01 compute-2 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct 10 10:02:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:01.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:01 compute-2 ceph-mon[74913]: pgmap v474: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:02:01 compute-2 systemd[1]: Started libpod-conmon-8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629.scope.
Oct 10 10:02:01 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:02:01 compute-2 podman[202114]: 2025-10-10 10:02:01.28417996 +0000 UTC m=+0.369596424 container init 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 10 10:02:01 compute-2 podman[202114]: 2025-10-10 10:02:01.290518392 +0000 UTC m=+0.375934836 container start 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 10 10:02:01 compute-2 podman[202114]: 2025-10-10 10:02:01.293954671 +0000 UTC m=+0.379371115 container attach 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 10 10:02:01 compute-2 iscsid_config[202271]: iqn.1994-05.com.redhat:d6e1178f5fe2
Oct 10 10:02:01 compute-2 systemd[1]: libpod-8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629.scope: Deactivated successfully.
Oct 10 10:02:01 compute-2 podman[202114]: 2025-10-10 10:02:01.295873362 +0000 UTC m=+0.381289816 container died 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 10 10:02:01 compute-2 kernel: podman0: port 1(veth0) entered disabled state
Oct 10 10:02:01 compute-2 kernel: veth0 (unregistering): left allmulticast mode
Oct 10 10:02:01 compute-2 kernel: veth0 (unregistering): left promiscuous mode
Oct 10 10:02:01 compute-2 kernel: podman0: port 1(veth0) entered disabled state
Oct 10 10:02:01 compute-2 NetworkManager[44866]: <info>  [1760090521.3460] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 10 10:02:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:01 compute-2 systemd[1]: run-netns-netns\x2de4731ffb\x2d6cfe\x2dede5\x2d6505\x2d7fa8772e9eb3.mount: Deactivated successfully.
Oct 10 10:02:01 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629-userdata-shm.mount: Deactivated successfully.
Oct 10 10:02:01 compute-2 systemd[1]: var-lib-containers-storage-overlay-555c874926332621b9a9c5aa9a07878d9118ea4b416f93c8b49210487c5856c2-merged.mount: Deactivated successfully.
Oct 10 10:02:01 compute-2 podman[202114]: 2025-10-10 10:02:01.660190268 +0000 UTC m=+0.745606712 container remove 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 10 10:02:01 compute-2 python3.9[202041]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f /usr/sbin/iscsi-iname
Oct 10 10:02:01 compute-2 systemd[1]: libpod-conmon-8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629.scope: Deactivated successfully.
Oct 10 10:02:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:01.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:01 compute-2 python3.9[202041]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: 
                                             DEPRECATED command:
                                             It is recommended to use Quadlets for running containers and pods under systemd.
                                             
                                             Please refer to podman-systemd.unit(5) for details.
                                             Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct 10 10:02:01 compute-2 sudo[202039]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:02:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002ea0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:02 compute-2 sudo[202513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-magtsexfkjlgvvvlmblygyajiynrmvjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090522.5644763-319-93793982205751/AnsiballZ_stat.py'
Oct 10 10:02:02 compute-2 sudo[202513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:03 compute-2 python3.9[202515]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:02:03 compute-2 sudo[202513]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:03.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:03 compute-2 ceph-mon[74913]: pgmap v475: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:02:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:03 compute-2 sudo[202636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxzpmrqvaaveupqlheuvrukfjeocwmyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090522.5644763-319-93793982205751/AnsiballZ_copy.py'
Oct 10 10:02:03 compute-2 sudo[202636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:03.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:03 compute-2 python3.9[202638]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090522.5644763-319-93793982205751/.source.iscsi _original_basename=.vcm9vftn follow=False checksum=9695408341163c1bdea87fa513eba8362730e33b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:03 compute-2 sudo[202636]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c002f40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:04 compute-2 sudo[202788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkimmxaddgqkjssvhjsygkkikktjoslw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090524.2173958-364-140747491128207/AnsiballZ_file.py'
Oct 10 10:02:04 compute-2 sudo[202788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:04 compute-2 python3.9[202790]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:04 compute-2 sudo[202788]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002ea0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:05.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:05 compute-2 python3.9[202942]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:02:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:05 compute-2 ceph-mon[74913]: pgmap v476: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:02:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:05.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:06 compute-2 sudo[203094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iobyibexrzaafjdlqometgwuxhitvmbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090525.9111335-415-133644117579492/AnsiballZ_lineinfile.py'
Oct 10 10:02:06 compute-2 sudo[203094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002ea0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:06 compute-2 python3.9[203096]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:06 compute-2 sudo[203094]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c002f40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:07.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:07 compute-2 sudo[203248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whcnoosntxxxulajssafyvrwgwlkihiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090527.0326514-443-132408874738205/AnsiballZ_file.py'
Oct 10 10:02:07 compute-2 sudo[203248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:07 compute-2 python3.9[203250]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:02:07 compute-2 sudo[203248]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:07 compute-2 ceph-mon[74913]: pgmap v477: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:02:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:07.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:08 compute-2 sudo[203419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyxjcvtdxfrvzhibsuojdgzqjqzombuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090527.8243158-467-81632951421819/AnsiballZ_stat.py'
Oct 10 10:02:08 compute-2 sudo[203419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:08 compute-2 podman[203374]: 2025-10-10 10:02:08.208214368 +0000 UTC m=+0.100544421 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:02:08 compute-2 python3.9[203425]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:02:08 compute-2 sudo[203419]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002ea0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:08 compute-2 sudo[203505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mirzpojqfguupzdxkvsosavwiiatqsbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090527.8243158-467-81632951421819/AnsiballZ_file.py'
Oct 10 10:02:08 compute-2 sudo[203505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004040 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:08 compute-2 python3.9[203507]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:02:08 compute-2 sudo[203505]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:09.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:09 compute-2 sudo[203658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztocmvvjinxzqllobtwzqqoftfvpylwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090529.0104892-467-192743789210848/AnsiballZ_stat.py'
Oct 10 10:02:09 compute-2 sudo[203658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:09 compute-2 python3.9[203660]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:02:09 compute-2 sudo[203658]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:09 compute-2 ceph-mon[74913]: pgmap v478: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:02:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:09.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:09 compute-2 sudo[203736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmnlpzsniksqtextegngwnweewssmihl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090529.0104892-467-192743789210848/AnsiballZ_file.py'
Oct 10 10:02:09 compute-2 sudo[203736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:10 compute-2 python3.9[203738]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:02:10 compute-2 sudo[203736]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:10 compute-2 sudo[203889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcqoxsoqkzvijvybqopfhdylznmwyegn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090530.5227437-535-32779219559434/AnsiballZ_file.py'
Oct 10 10:02:10 compute-2 sudo[203889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:10 compute-2 python3.9[203891]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:10 compute-2 sudo[203889]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:11.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:11 compute-2 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 10:02:11 compute-2 sudo[204042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agjmldntfaehhbgaryjtzenqirqmrgpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090531.2990825-560-270823282252025/AnsiballZ_stat.py'
Oct 10 10:02:11 compute-2 sudo[204042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:11 compute-2 ceph-mon[74913]: pgmap v479: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:02:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:11.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:11 compute-2 python3.9[204044]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:02:11 compute-2 sudo[204042]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:12 compute-2 sudo[204120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbpjsaevkekinifuimyeetvlhkwbvbga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090531.2990825-560-270823282252025/AnsiballZ_file.py'
Oct 10 10:02:12 compute-2 sudo[204120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:12 compute-2 python3.9[204122]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:12 compute-2 sudo[204120]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004040 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:12 compute-2 sudo[204274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntyswrjroaobhutptlrktgtpcqetqqwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090532.7150617-597-115330077816953/AnsiballZ_stat.py'
Oct 10 10:02:12 compute-2 sudo[204274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000064s ======
Oct 10 10:02:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:13.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Oct 10 10:02:13 compute-2 python3.9[204276]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:02:13 compute-2 sudo[204274]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:13 compute-2 sudo[204352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbbwuuesycgjxhxmtxkqjdghdegarhlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090532.7150617-597-115330077816953/AnsiballZ_file.py'
Oct 10 10:02:13 compute-2 sudo[204352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:13 compute-2 python3.9[204354]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:13 compute-2 sudo[204352]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:13 compute-2 ceph-mon[74913]: pgmap v480: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 10:02:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:13.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:14 compute-2 sudo[204504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fezmjrascerwxkaexnqzblgycvnkjsun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090534.0851204-632-255573887992534/AnsiballZ_systemd.py'
Oct 10 10:02:14 compute-2 sudo[204504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003e10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:14 compute-2 sudo[204507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:02:14 compute-2 sudo[204507]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:02:14 compute-2 sudo[204507]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004040 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:14 compute-2 python3.9[204506]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:02:14 compute-2 systemd[1]: Reloading.
Oct 10 10:02:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:14 compute-2 systemd-rc-local-generator[204561]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:02:14 compute-2 systemd-sysv-generator[204565]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:02:15 compute-2 sudo[204504]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:15.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:15 compute-2 ceph-mon[74913]: pgmap v481: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:02:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:15.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:15 compute-2 sudo[204732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dodvsrhdgnxazyfsyijbpvnirbghogmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090535.4967225-656-39364202112344/AnsiballZ_stat.py'
Oct 10 10:02:15 compute-2 sudo[204732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:15 compute-2 podman[204695]: 2025-10-10 10:02:15.781737537 +0000 UTC m=+0.056251841 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 10:02:15 compute-2 python3.9[204740]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:02:16 compute-2 sudo[204732]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:16 compute-2 sudo[204818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zznyqgibqlwaaaxhwxnuazcikrnenysk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090535.4967225-656-39364202112344/AnsiballZ_file.py'
Oct 10 10:02:16 compute-2 sudo[204818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100216 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:02:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:16 compute-2 python3.9[204820]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:16 compute-2 sudo[204818]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:02:17 compute-2 sudo[204972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvumfgkwxksygjzeohcqetchsqosiwxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090536.859305-691-47239842642024/AnsiballZ_stat.py'
Oct 10 10:02:17 compute-2 sudo[204972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:17.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:17 compute-2 python3.9[204974]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:02:17 compute-2 sudo[204972]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:17 compute-2 sudo[205050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yltoxrmrvdqyocllvnsycxlrmevomzyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090536.859305-691-47239842642024/AnsiballZ_file.py'
Oct 10 10:02:17 compute-2 sudo[205050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:17 compute-2 python3.9[205052]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:17 compute-2 sudo[205050]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:02:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:17.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:02:17 compute-2 ceph-mon[74913]: pgmap v482: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:02:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003e50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:18 compute-2 sudo[205202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmhpikcvhiinbvcjvotqdlrvpryiludb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090538.205449-728-129638957588830/AnsiballZ_systemd.py'
Oct 10 10:02:18 compute-2 sudo[205202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:18 compute-2 python3.9[205204]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:02:18 compute-2 systemd[1]: Reloading.
Oct 10 10:02:18 compute-2 systemd-rc-local-generator[205233]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:02:18 compute-2 systemd-sysv-generator[205236]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:02:19 compute-2 sudo[205242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:02:19 compute-2 sudo[205242]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:02:19 compute-2 sudo[205242]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:19.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:19 compute-2 systemd[1]: Starting Create netns directory...
Oct 10 10:02:19 compute-2 sudo[205269]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:02:19 compute-2 sudo[205269]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:02:19 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 10:02:19 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 10:02:19 compute-2 systemd[1]: Finished Create netns directory.
Oct 10 10:02:19 compute-2 sudo[205202]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:19.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:19 compute-2 sudo[205269]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:19 compute-2 ceph-mon[74913]: pgmap v483: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:02:19 compute-2 sudo[205477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pozhborlsghrjkdbheegvqlnfhsezklb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090539.6873145-758-16820949019243/AnsiballZ_file.py'
Oct 10 10:02:19 compute-2 sudo[205477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:20 compute-2 python3.9[205479]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:02:20 compute-2 sudo[205477]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003e50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:02:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:02:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:02:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:02:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:02:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:02:20 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:02:20 compute-2 sudo[205631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-greotaltdfojwzqnoznimhrltynujbrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090540.5293305-782-201508999599134/AnsiballZ_stat.py'
Oct 10 10:02:20 compute-2 sudo[205631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:21 compute-2 python3.9[205633]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:02:21 compute-2 sudo[205631]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:21.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:21 compute-2 sudo[205754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwjfistqgyfmlavlsriyacccohoemrox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090540.5293305-782-201508999599134/AnsiballZ_copy.py'
Oct 10 10:02:21 compute-2 sudo[205754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:21 compute-2 python3.9[205756]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090540.5293305-782-201508999599134/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:02:21 compute-2 sudo[205754]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:21.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:22 compute-2 ceph-mon[74913]: pgmap v484: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:02:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003e70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:22 compute-2 sudo[205907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zktscyvyouxobrdaicgjyldeomvdwhor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090542.4170425-833-134129031380685/AnsiballZ_file.py'
Oct 10 10:02:22 compute-2 sudo[205907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:23 compute-2 python3.9[205910]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:02:23 compute-2 sudo[205907]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:23.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:23 compute-2 sudo[206060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qctogzsbaorhlzxxgupcpgjyuixrwbna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090543.2935624-856-230394218249425/AnsiballZ_stat.py'
Oct 10 10:02:23 compute-2 sudo[206060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:23 compute-2 python3.9[206062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:02:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:23 compute-2 sudo[206060]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:23.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:24 compute-2 sudo[206183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcpghkqxiyzsispapgegvzskyqppawcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090543.2935624-856-230394218249425/AnsiballZ_copy.py'
Oct 10 10:02:24 compute-2 sudo[206183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:24 compute-2 ceph-mon[74913]: pgmap v485: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:02:24 compute-2 python3.9[206185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090543.2935624-856-230394218249425/.source.json _original_basename=.55bfguvo follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:24 compute-2 sudo[206183]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:02:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:25 compute-2 sudo[206337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdeanidmkykvckgzlcpilkcjmerejvyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090544.798389-902-126755360344286/AnsiballZ_file.py'
Oct 10 10:02:25 compute-2 sudo[206337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:25.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:25 compute-2 python3.9[206339]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:25 compute-2 sudo[206337]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:25 compute-2 sudo[206340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:02:25 compute-2 sudo[206340]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:02:25 compute-2 sudo[206340]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:25.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:26 compute-2 ceph-mon[74913]: pgmap v486: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:02:26 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:02:26 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:02:26 compute-2 sudo[206514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvgtgwtpnoooqonmjqmzkpwdpwwhonaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090545.7930353-925-201258912493785/AnsiballZ_stat.py'
Oct 10 10:02:26 compute-2 sudo[206514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:26 compute-2 sudo[206514]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003e90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:26 compute-2 sudo[206638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeqcsbcmrlxtfwgayegwggbtpsnetyve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090545.7930353-925-201258912493785/AnsiballZ_copy.py'
Oct 10 10:02:26 compute-2 sudo[206638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:26 compute-2 sudo[206638]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:27.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:27 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:02:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:27 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:02:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:27.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:27 compute-2 sudo[206791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjdkuwfydfsaojnzzdmwyvspmygtkmxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090547.369434-977-241770267095330/AnsiballZ_container_config_data.py'
Oct 10 10:02:27 compute-2 sudo[206791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:28 compute-2 python3.9[206793]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct 10 10:02:28 compute-2 sudo[206791]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:28 compute-2 ceph-mon[74913]: pgmap v487: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:02:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:28 compute-2 sudo[206945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyinblpdqwxbxhbvrucykupbhkunvmbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090548.4131935-1004-100627293057842/AnsiballZ_container_config_hash.py'
Oct 10 10:02:28 compute-2 sudo[206945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:29 compute-2 python3.9[206947]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 10:02:29 compute-2 sudo[206945]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:29.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:29.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:30 compute-2 sudo[207097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cadlhxhuljgjoahhwimbqjrvtcxyshcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090549.6292293-1031-48600715497765/AnsiballZ_podman_container_info.py'
Oct 10 10:02:30 compute-2 sudo[207097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:30 compute-2 ceph-mon[74913]: pgmap v488: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 596 B/s wr, 2 op/s
Oct 10 10:02:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:30 compute-2 python3.9[207099]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 10 10:02:30 compute-2 sudo[207097]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:02:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:31.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:31.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:32 compute-2 sudo[207278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eicvjlcuefbodyqjufmmtwgznmzznvnq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760090551.5393524-1069-266532626532331/AnsiballZ_edpm_container_manage.py'
Oct 10 10:02:32 compute-2 sudo[207278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:32 compute-2 ceph-mon[74913]: pgmap v489: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 596 B/s wr, 2 op/s
Oct 10 10:02:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:02:32 compute-2 python3[207280]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 10:02:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:32 compute-2 podman[207313]: 2025-10-10 10:02:32.497065971 +0000 UTC m=+0.023537860 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct 10 10:02:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:32 compute-2 podman[207313]: 2025-10-10 10:02:32.650452394 +0000 UTC m=+0.176924263 container create e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, config_id=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 10:02:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:32 compute-2 python3[207280]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct 10 10:02:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:32 compute-2 sudo[207278]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:33 compute-2 ceph-mon[74913]: pgmap v490: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:02:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:33.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:33 compute-2 sudo[207503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbyalcmtlrilggzpmzvwbijkxnqxbzrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090553.2459865-1093-251661331000777/AnsiballZ_stat.py'
Oct 10 10:02:33 compute-2 sudo[207503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:33 compute-2 python3.9[207505]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:02:33 compute-2 sudo[207503]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:33.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:34 compute-2 sudo[207657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkpachiqvvgnuhiqruszfriwkqgqyyeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090554.1550033-1120-15538046364524/AnsiballZ_file.py'
Oct 10 10:02:34 compute-2 sudo[207657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:34 compute-2 sudo[207660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:02:34 compute-2 sudo[207660]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:02:34 compute-2 sudo[207660]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:34 compute-2 python3.9[207659]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:34 compute-2 sudo[207657]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003ef0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:34 compute-2 sudo[207760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixdwsgsfcdbbjrsxfijsdhqdxlfbhitl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090554.1550033-1120-15538046364524/AnsiballZ_stat.py'
Oct 10 10:02:34 compute-2 sudo[207760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:35 compute-2 python3.9[207762]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:02:35 compute-2 sudo[207760]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:35.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:35 compute-2 sudo[207911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqzrkvtmynnfzpstcottxscqvcwsricm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090555.1040416-1120-272165618370551/AnsiballZ_copy.py'
Oct 10 10:02:35 compute-2 sudo[207911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:35 compute-2 ceph-mon[74913]: pgmap v491: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:02:35 compute-2 python3.9[207913]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760090555.1040416-1120-272165618370551/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:35 compute-2 sudo[207911]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:35.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:35 compute-2 sudo[207987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxcpqurzhiwvvyoimmiawwhjbceqmmbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090555.1040416-1120-272165618370551/AnsiballZ_systemd.py'
Oct 10 10:02:35 compute-2 sudo[207987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:36 compute-2 python3.9[207989]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 10:02:36 compute-2 systemd[1]: Reloading.
Oct 10 10:02:36 compute-2 systemd-rc-local-generator[208014]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:02:36 compute-2 systemd-sysv-generator[208019]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:02:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100236 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:02:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:36 compute-2 sudo[207987]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:36 compute-2 sudo[208101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deccivulwcaiushfjwglnhndxbssubfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090555.1040416-1120-272165618370551/AnsiballZ_systemd.py'
Oct 10 10:02:36 compute-2 sudo[208101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:37 compute-2 python3.9[208103]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:02:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:37.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:37 compute-2 systemd[1]: Reloading.
Oct 10 10:02:37 compute-2 systemd-rc-local-generator[208132]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:02:37 compute-2 systemd-sysv-generator[208135]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:02:37 compute-2 systemd[1]: Starting iscsid container...
Oct 10 10:02:37 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:02:37 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3aac090083d025eb07f24ae40c0fec22943516518b8e7fdebcf9328461a5c6a/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct 10 10:02:37 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3aac090083d025eb07f24ae40c0fec22943516518b8e7fdebcf9328461a5c6a/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 10:02:37 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3aac090083d025eb07f24ae40c0fec22943516518b8e7fdebcf9328461a5c6a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 10:02:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:37 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0.
Oct 10 10:02:37 compute-2 podman[208142]: 2025-10-10 10:02:37.627090159 +0000 UTC m=+0.108997960 container init e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 10 10:02:37 compute-2 iscsid[208157]: + sudo -E kolla_set_configs
Oct 10 10:02:37 compute-2 podman[208142]: 2025-10-10 10:02:37.665587274 +0000 UTC m=+0.147495065 container start e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 10:02:37 compute-2 sudo[208163]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 10 10:02:37 compute-2 podman[208142]: iscsid
Oct 10 10:02:37 compute-2 systemd[1]: Started iscsid container.
Oct 10 10:02:37 compute-2 systemd[1]: Created slice User Slice of UID 0.
Oct 10 10:02:37 compute-2 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 10 10:02:37 compute-2 sudo[208101]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:37 compute-2 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 10 10:02:37 compute-2 ceph-mon[74913]: pgmap v492: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:02:37 compute-2 systemd[1]: Starting User Manager for UID 0...
Oct 10 10:02:37 compute-2 systemd[208177]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 10 10:02:37 compute-2 podman[208164]: 2025-10-10 10:02:37.770914027 +0000 UTC m=+0.096155561 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 10:02:37 compute-2 systemd[1]: e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0-1fe2500b8305d33.service: Main process exited, code=exited, status=1/FAILURE
Oct 10 10:02:37 compute-2 systemd[1]: e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0-1fe2500b8305d33.service: Failed with result 'exit-code'.
Oct 10 10:02:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:37.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:37 compute-2 systemd[208177]: Queued start job for default target Main User Target.
Oct 10 10:02:37 compute-2 systemd[208177]: Created slice User Application Slice.
Oct 10 10:02:37 compute-2 systemd[208177]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 10 10:02:37 compute-2 systemd[208177]: Started Daily Cleanup of User's Temporary Directories.
Oct 10 10:02:37 compute-2 systemd[208177]: Reached target Paths.
Oct 10 10:02:37 compute-2 systemd[208177]: Reached target Timers.
Oct 10 10:02:37 compute-2 systemd[208177]: Starting D-Bus User Message Bus Socket...
Oct 10 10:02:37 compute-2 systemd[208177]: Starting Create User's Volatile Files and Directories...
Oct 10 10:02:37 compute-2 systemd[208177]: Finished Create User's Volatile Files and Directories.
Oct 10 10:02:37 compute-2 systemd[208177]: Listening on D-Bus User Message Bus Socket.
Oct 10 10:02:37 compute-2 systemd[208177]: Reached target Sockets.
Oct 10 10:02:37 compute-2 systemd[208177]: Reached target Basic System.
Oct 10 10:02:37 compute-2 systemd[208177]: Reached target Main User Target.
Oct 10 10:02:37 compute-2 systemd[208177]: Startup finished in 136ms.
Oct 10 10:02:37 compute-2 systemd[1]: Started User Manager for UID 0.
Oct 10 10:02:37 compute-2 systemd[1]: Started Session c3 of User root.
Oct 10 10:02:37 compute-2 sudo[208163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 10 10:02:37 compute-2 iscsid[208157]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 10:02:37 compute-2 iscsid[208157]: INFO:__main__:Validating config file
Oct 10 10:02:37 compute-2 iscsid[208157]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 10:02:37 compute-2 iscsid[208157]: INFO:__main__:Writing out command to execute
Oct 10 10:02:37 compute-2 sudo[208163]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:37 compute-2 systemd[1]: session-c3.scope: Deactivated successfully.
Oct 10 10:02:37 compute-2 iscsid[208157]: ++ cat /run_command
Oct 10 10:02:37 compute-2 iscsid[208157]: + CMD='/usr/sbin/iscsid -f'
Oct 10 10:02:37 compute-2 iscsid[208157]: + ARGS=
Oct 10 10:02:37 compute-2 iscsid[208157]: + sudo kolla_copy_cacerts
Oct 10 10:02:37 compute-2 sudo[208229]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 10 10:02:37 compute-2 systemd[1]: Started Session c4 of User root.
Oct 10 10:02:37 compute-2 sudo[208229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 10 10:02:37 compute-2 sudo[208229]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:37 compute-2 iscsid[208157]: + [[ ! -n '' ]]
Oct 10 10:02:37 compute-2 systemd[1]: session-c4.scope: Deactivated successfully.
Oct 10 10:02:37 compute-2 iscsid[208157]: + . kolla_extend_start
Oct 10 10:02:37 compute-2 iscsid[208157]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct 10 10:02:37 compute-2 iscsid[208157]: Running command: '/usr/sbin/iscsid -f'
Oct 10 10:02:37 compute-2 iscsid[208157]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct 10 10:02:37 compute-2 iscsid[208157]: + umask 0022
Oct 10 10:02:37 compute-2 iscsid[208157]: + exec /usr/sbin/iscsid -f
Oct 10 10:02:38 compute-2 kernel: Loading iSCSI transport class v2.0-870.
Oct 10 10:02:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003f10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:38 compute-2 podman[208335]: 2025-10-10 10:02:38.819565363 +0000 UTC m=+0.091346578 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Oct 10 10:02:38 compute-2 python3.9[208385]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:02:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:39.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:39 compute-2 sudo[208542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzknvpyzbibrsnxxgodyhfrhilvtgnyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090559.3573005-1232-80995946549546/AnsiballZ_file.py'
Oct 10 10:02:39 compute-2 sudo[208542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:39 compute-2 ceph-mon[74913]: pgmap v493: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:02:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:39.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:39 compute-2 python3.9[208544]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:39 compute-2 sudo[208542]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:40 compute-2 sudo[208695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqrbjwkbzycolvzwgyvoojkjtamyjcmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090560.4593701-1265-150276541528352/AnsiballZ_service_facts.py'
Oct 10 10:02:40 compute-2 sudo[208695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:41 compute-2 python3.9[208697]: ansible-ansible.builtin.service_facts Invoked
Oct 10 10:02:41 compute-2 network[208715]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 10:02:41 compute-2 network[208716]: 'network-scripts' will be removed from distribution in near future.
Oct 10 10:02:41 compute-2 network[208717]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 10:02:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:41.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:02:41.452 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:02:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:02:41.453 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:02:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:02:41.453 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:02:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:41 compute-2 ceph-mon[74913]: pgmap v494: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:02:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:41.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003f50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:02:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:43.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:02:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:43 compute-2 ceph-mon[74913]: pgmap v495: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:02:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:43.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:44 compute-2 sudo[208695]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:45.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:45 compute-2 ceph-mon[74913]: pgmap v496: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:02:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:45.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003f70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:46 compute-2 podman[208970]: 2025-10-10 10:02:46.740102227 +0000 UTC m=+0.057779400 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 10:02:46 compute-2 sudo[209014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpvoenhyrlhyfznrjlzpymieppckdinq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090566.4498675-1295-33835508563895/AnsiballZ_file.py'
Oct 10 10:02:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:46 compute-2 sudo[209014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:02:46 compute-2 python3.9[209018]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 10 10:02:46 compute-2 sudo[209014]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:47.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:47 compute-2 sudo[209169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpqijgellljbuzcsgtrlrcwfpuddgmqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090567.187343-1318-108404414083090/AnsiballZ_modprobe.py'
Oct 10 10:02:47 compute-2 sudo[209169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:47 compute-2 ceph-mon[74913]: pgmap v497: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:02:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:47.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:47 compute-2 python3.9[209171]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct 10 10:02:47 compute-2 sudo[209169]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:48 compute-2 systemd[1]: Stopping User Manager for UID 0...
Oct 10 10:02:48 compute-2 systemd[208177]: Activating special unit Exit the Session...
Oct 10 10:02:48 compute-2 systemd[208177]: Stopped target Main User Target.
Oct 10 10:02:48 compute-2 systemd[208177]: Stopped target Basic System.
Oct 10 10:02:48 compute-2 systemd[208177]: Stopped target Paths.
Oct 10 10:02:48 compute-2 systemd[208177]: Stopped target Sockets.
Oct 10 10:02:48 compute-2 systemd[208177]: Stopped target Timers.
Oct 10 10:02:48 compute-2 systemd[208177]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 10 10:02:48 compute-2 systemd[208177]: Closed D-Bus User Message Bus Socket.
Oct 10 10:02:48 compute-2 systemd[208177]: Stopped Create User's Volatile Files and Directories.
Oct 10 10:02:48 compute-2 systemd[208177]: Removed slice User Application Slice.
Oct 10 10:02:48 compute-2 systemd[208177]: Reached target Shutdown.
Oct 10 10:02:48 compute-2 systemd[208177]: Finished Exit the Session.
Oct 10 10:02:48 compute-2 systemd[208177]: Reached target Exit the Session.
Oct 10 10:02:48 compute-2 systemd[1]: user@0.service: Deactivated successfully.
Oct 10 10:02:48 compute-2 systemd[1]: Stopped User Manager for UID 0.
Oct 10 10:02:48 compute-2 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 10 10:02:48 compute-2 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 10 10:02:48 compute-2 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 10 10:02:48 compute-2 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 10 10:02:48 compute-2 systemd[1]: Removed slice User Slice of UID 0.
Oct 10 10:02:48 compute-2 sudo[209326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmwoqjebsenofqpouywaaduhkzpnolwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090568.1586635-1343-220083241216947/AnsiballZ_stat.py'
Oct 10 10:02:48 compute-2 sudo[209326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf0000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:48 compute-2 python3.9[209328]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:02:48 compute-2 sudo[209326]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:49 compute-2 sudo[209452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arholobhvjywvlkhakfdkaboewlnapcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090568.1586635-1343-220083241216947/AnsiballZ_copy.py'
Oct 10 10:02:49 compute-2 sudo[209452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:02:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:49.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:02:49 compute-2 python3.9[209454]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090568.1586635-1343-220083241216947/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:49 compute-2 sudo[209452]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:49 compute-2 ceph-mon[74913]: pgmap v498: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:02:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:49.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:50 compute-2 sudo[209604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzcjpdayldoyrbmchbticscvqyfozymc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090569.7532947-1390-119581338195338/AnsiballZ_lineinfile.py'
Oct 10 10:02:50 compute-2 sudo[209604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:50 compute-2 python3.9[209606]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:50 compute-2 sudo[209604]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003fb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:50 compute-2 sudo[209758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krprorbflnmbdoszifewffnbvcblzjfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090570.5632439-1414-175001290411726/AnsiballZ_systemd.py'
Oct 10 10:02:50 compute-2 sudo[209758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:51 compute-2 python3.9[209760]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 10:02:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:51.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:51 compute-2 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 10 10:02:51 compute-2 systemd[1]: Stopped Load Kernel Modules.
Oct 10 10:02:51 compute-2 systemd[1]: Stopping Load Kernel Modules...
Oct 10 10:02:51 compute-2 systemd[1]: Starting Load Kernel Modules...
Oct 10 10:02:51 compute-2 systemd[1]: Finished Load Kernel Modules.
Oct 10 10:02:51 compute-2 sudo[209758]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:51 compute-2 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct 10 10:02:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:51.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:51 compute-2 ceph-mon[74913]: pgmap v499: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:02:51 compute-2 sudo[209915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wngxgxldexwajfdgdawrhnsicovtjydc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090571.7009273-1439-165048431875434/AnsiballZ_file.py'
Oct 10 10:02:51 compute-2 sudo[209915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:52 compute-2 python3.9[209917]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:02:52 compute-2 sudo[209915]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:52 compute-2 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 10 10:02:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:52 compute-2 sudo[210070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjqjwimkibhjszxnolextqwocnnvdkkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090572.6301444-1466-94700956805834/AnsiballZ_stat.py'
Oct 10 10:02:52 compute-2 sudo[210070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:53 compute-2 python3.9[210072]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:02:53 compute-2 sudo[210070]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:53.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:53 compute-2 sudo[210222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mybaidkatxocubgugrndjlpfjvrmqjxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090573.5513668-1492-92324652617390/AnsiballZ_stat.py'
Oct 10 10:02:53 compute-2 sudo[210222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:53.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:53 compute-2 ceph-mon[74913]: pgmap v500: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 10:02:54 compute-2 python3.9[210224]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:02:54 compute-2 sudo[210222]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003fd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:54 compute-2 sudo[210374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nskuavjjvztpwlqmskoreiaunuhsfebe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090574.301278-1516-124561907368109/AnsiballZ_stat.py'
Oct 10 10:02:54 compute-2 sudo[210374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:54 compute-2 sudo[210378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:02:54 compute-2 sudo[210378]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:02:54 compute-2 sudo[210378]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:54 compute-2 python3.9[210377]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:02:54 compute-2 sudo[210374]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:55 compute-2 sudo[210524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grkfdmyektznqqxtckrpuangpvyqlmjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090574.301278-1516-124561907368109/AnsiballZ_copy.py'
Oct 10 10:02:55 compute-2 sudo[210524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:02:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:55.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:55 compute-2 python3.9[210526]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090574.301278-1516-124561907368109/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:55 compute-2 sudo[210524]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:55.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:55 compute-2 ceph-mon[74913]: pgmap v501: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:02:56 compute-2 sudo[210676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufdmqqixczvxffqbinsdfzhxvlctopmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090575.804849-1562-90793825555273/AnsiballZ_command.py'
Oct 10 10:02:56 compute-2 sudo[210676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:56 compute-2 python3.9[210678]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:02:56 compute-2 sudo[210676]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003ff0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:56 compute-2 sudo[210831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulffuaiuseaohymmrnetycxlpitmjruo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090576.6743143-1586-258256336071948/AnsiballZ_lineinfile.py'
Oct 10 10:02:56 compute-2 sudo[210831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:57 compute-2 python3.9[210833]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:57 compute-2 sudo[210831]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:57.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:02:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:57.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:02:57 compute-2 ceph-mon[74913]: pgmap v502: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:02:57 compute-2 sudo[210983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxmrjvvvjwpnppypqrtixekhjhraplap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090577.4560788-1610-3361683989366/AnsiballZ_replace.py'
Oct 10 10:02:57 compute-2 sudo[210983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:58 compute-2 python3.9[210985]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:58 compute-2 sudo[210983]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:58 compute-2 sudo[211136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pydicnjkcdbidohpwyzrruzvmfanipzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090578.4125261-1634-180446859534921/AnsiballZ_replace.py'
Oct 10 10:02:58 compute-2 sudo[211136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004010 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:02:58 compute-2 python3.9[211138]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:58 compute-2 sudo[211136]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.003000096s ======
Oct 10 10:02:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:59.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000096s
Oct 10 10:02:59 compute-2 sudo[211289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nksinftrlonwjuzfoeerikktiutpiiet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090579.2599075-1660-221577868957377/AnsiballZ_lineinfile.py'
Oct 10 10:02:59 compute-2 sudo[211289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:02:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:02:59 compute-2 python3.9[211291]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:02:59 compute-2 sudo[211289]: pam_unix(sudo:session): session closed for user root
Oct 10 10:02:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:02:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:02:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:59.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:02:59 compute-2 ceph-mon[74913]: pgmap v503: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 10:03:00 compute-2 sudo[211441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odnvefhvbbwreakrtmntsblgbyqswbip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090579.8503964-1660-47408524025833/AnsiballZ_lineinfile.py'
Oct 10 10:03:00 compute-2 sudo[211441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:00 compute-2 python3.9[211443]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:00 compute-2 sudo[211441]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:00 compute-2 sudo[211594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtqtbvyjdsrgqapctyoybnmlfmuvgmci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090580.4624524-1660-243154276166906/AnsiballZ_lineinfile.py'
Oct 10 10:03:00 compute-2 sudo[211594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:00 compute-2 python3.9[211596]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:00 compute-2 sudo[211594]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:01.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:01 compute-2 sudo[211747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzyjlmzglhhirdnppadizgyywqpfcxwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090581.130669-1660-89812906608886/AnsiballZ_lineinfile.py'
Oct 10 10:03:01 compute-2 sudo[211747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:01 compute-2 python3.9[211749]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:01 compute-2 sudo[211747]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:01.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:01 compute-2 ceph-mon[74913]: pgmap v504: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:03:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:03:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004030 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:02 compute-2 sudo[211900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sajykdceaihymkhfkfezjnetixmcppmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090582.2900906-1749-7751638698881/AnsiballZ_stat.py'
Oct 10 10:03:02 compute-2 sudo[211900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:02 compute-2 python3.9[211902]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:03:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:02 compute-2 sudo[211900]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:03.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:03 compute-2 sudo[212055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ircibvodhirbcvaclyabteduuonopuce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090583.0894365-1772-261213598497931/AnsiballZ_file.py'
Oct 10 10:03:03 compute-2 sudo[212055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:03 compute-2 python3.9[212057]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:03 compute-2 sudo[212055]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:03.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:03 compute-2 ceph-mon[74913]: pgmap v505: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 10:03:04 compute-2 sudo[212207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frlndlmfzlppclnhawsumsequveoejcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090584.0205827-1799-199252869168802/AnsiballZ_file.py'
Oct 10 10:03:04 compute-2 sudo[212207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:04 compute-2 python3.9[212209]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:03:04 compute-2 sudo[212207]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:04 compute-2 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 10 10:03:04 compute-2 systemd[1]: virtqemud.service: Deactivated successfully.
Oct 10 10:03:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004050 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:05 compute-2 sudo[212363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zavstclxxxpvevxbiuownzrmyzlmywwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090584.8588085-1823-126463547512746/AnsiballZ_stat.py'
Oct 10 10:03:05 compute-2 sudo[212363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:03:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:05.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:03:05 compute-2 python3.9[212365]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:03:05 compute-2 sudo[212363]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:05 compute-2 sudo[212441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irzyjjwrsqbmfbllrzgvrjxehmrxzumc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090584.8588085-1823-126463547512746/AnsiballZ_file.py'
Oct 10 10:03:05 compute-2 sudo[212441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:05 compute-2 python3.9[212443]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:03:05 compute-2 sudo[212441]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:03:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:05.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:03:05 compute-2 ceph-mon[74913]: pgmap v506: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:03:06 compute-2 sudo[212593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pytausaaalxwdyywdovmbhfptyebnaas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090585.8827758-1823-55686084011443/AnsiballZ_stat.py'
Oct 10 10:03:06 compute-2 sudo[212593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:06 compute-2 python3.9[212595]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:03:06 compute-2 sudo[212593]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:06 compute-2 sudo[212672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrdzltodjzhewcponqhevnanwulefsvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090585.8827758-1823-55686084011443/AnsiballZ_file.py'
Oct 10 10:03:06 compute-2 sudo[212672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004070 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:06 compute-2 python3.9[212674]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:03:06 compute-2 sudo[212672]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:07.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:07 compute-2 sudo[212825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsbblvntraryayzvypjsuloisklueeph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090587.4627337-1891-90570510000633/AnsiballZ_file.py'
Oct 10 10:03:07 compute-2 sudo[212825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:07.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:07 compute-2 python3.9[212827]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:07 compute-2 sudo[212825]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:07 compute-2 ceph-mon[74913]: pgmap v507: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:03:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14004b30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:08 compute-2 sudo[212988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuztopyzquzsgjhjzddmsphzndqvnkql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090588.278175-1916-34228590121573/AnsiballZ_stat.py'
Oct 10 10:03:08 compute-2 sudo[212988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:08 compute-2 podman[212951]: 2025-10-10 10:03:08.555649342 +0000 UTC m=+0.055516564 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:03:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14004b30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:08 compute-2 python3.9[212997]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:03:08 compute-2 sudo[212988]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:08 compute-2 sudo[213091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsubncqgdxfahpjylukruafqgpliycpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090588.278175-1916-34228590121573/AnsiballZ_file.py'
Oct 10 10:03:08 compute-2 sudo[213091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:08 compute-2 podman[213051]: 2025-10-10 10:03:08.990745599 +0000 UTC m=+0.068122357 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:03:09 compute-2 python3.9[213099]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:09 compute-2 sudo[213091]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:09.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100309 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:03:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:09.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:09 compute-2 sudo[213255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggjuocpphthzcznvenfahcvrdzzcmvld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090589.6274388-1952-257857778077205/AnsiballZ_stat.py'
Oct 10 10:03:09 compute-2 sudo[213255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:10 compute-2 ceph-mon[74913]: pgmap v508: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 10:03:10 compute-2 python3.9[213257]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:03:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:10 compute-2 sudo[213255]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:10 compute-2 sudo[213333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzgtocxlsahqszkxfltayqyedmfhsrjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090589.6274388-1952-257857778077205/AnsiballZ_file.py'
Oct 10 10:03:10 compute-2 sudo[213333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:10 compute-2 python3.9[213335]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:10 compute-2 sudo[213333]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf0003700 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14004b50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:11.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:11 compute-2 sudo[213487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygvwmhtkngodblzqglnxlupcquaowmtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090590.9792964-1988-274749202516598/AnsiballZ_systemd.py'
Oct 10 10:03:11 compute-2 sudo[213487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:11 compute-2 python3.9[213489]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:03:11 compute-2 systemd[1]: Reloading.
Oct 10 10:03:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:11 compute-2 systemd-rc-local-generator[213517]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:03:11 compute-2 systemd-sysv-generator[213520]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:03:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:03:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:11.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:03:12 compute-2 sudo[213487]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:12 compute-2 ceph-mon[74913]: pgmap v509: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:03:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:12 compute-2 sudo[213676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfpbqdssotrijlafplqapqsqdyuflyzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090592.2605777-2012-109765315399982/AnsiballZ_stat.py'
Oct 10 10:03:12 compute-2 sudo[213676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:12 compute-2 python3.9[213679]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:03:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf0003700 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:12 compute-2 sudo[213676]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:12 compute-2 sudo[213756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxzzijrrdmqyroqpwkhbworsogkolpuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090592.2605777-2012-109765315399982/AnsiballZ_file.py'
Oct 10 10:03:13 compute-2 sudo[213756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:13 compute-2 python3.9[213758]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:13.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:13 compute-2 sudo[213756]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.831581) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593831641, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1233, "num_deletes": 254, "total_data_size": 2977320, "memory_usage": 3016544, "flush_reason": "Manual Compaction"}
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Oct 10 10:03:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:03:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:13.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593859866, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 1967335, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18679, "largest_seqno": 19907, "table_properties": {"data_size": 1962011, "index_size": 2784, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 10656, "raw_average_key_size": 18, "raw_value_size": 1951470, "raw_average_value_size": 3387, "num_data_blocks": 125, "num_entries": 576, "num_filter_entries": 576, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090484, "oldest_key_time": 1760090484, "file_creation_time": 1760090593, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 28376 microseconds, and 5037 cpu microseconds.
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.859953) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 1967335 bytes OK
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.859987) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.862945) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.862984) EVENT_LOG_v1 {"time_micros": 1760090593862974, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.863018) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 2971483, prev total WAL file size 2971747, number of live WAL files 2.
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.864222) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323531' seq:0, type:0; will stop at (end)
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(1921KB)], [33(11MB)]
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593864308, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 14184370, "oldest_snapshot_seqno": -1}
Oct 10 10:03:13 compute-2 sudo[213908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixacwnifjeqsmnrxmymmkxlhmoxstllk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090593.642036-2048-230376664482170/AnsiballZ_stat.py'
Oct 10 10:03:13 compute-2 sudo[213908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 5004 keys, 13702762 bytes, temperature: kUnknown
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593965892, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13702762, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13667797, "index_size": 21351, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 126959, "raw_average_key_size": 25, "raw_value_size": 13575589, "raw_average_value_size": 2712, "num_data_blocks": 878, "num_entries": 5004, "num_filter_entries": 5004, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090593, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.966355) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13702762 bytes
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.970814) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 139.2 rd, 134.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.7 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(14.2) write-amplify(7.0) OK, records in: 5526, records dropped: 522 output_compression: NoCompression
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.970848) EVENT_LOG_v1 {"time_micros": 1760090593970841, "job": 18, "event": "compaction_finished", "compaction_time_micros": 101863, "compaction_time_cpu_micros": 26386, "output_level": 6, "num_output_files": 1, "total_output_size": 13702762, "num_input_records": 5526, "num_output_records": 5004, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593971205, "job": 18, "event": "table_file_deletion", "file_number": 35}
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593973249, "job": 18, "event": "table_file_deletion", "file_number": 33}
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.864048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.973324) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.973329) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.973331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.973332) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:03:13 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.973333) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:03:14 compute-2 ceph-mon[74913]: pgmap v510: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 10:03:14 compute-2 python3.9[213910]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:03:14 compute-2 sudo[213908]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:14 compute-2 sudo[213986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzwcmxueplbhigghrruoqqnkjbtedatf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090593.642036-2048-230376664482170/AnsiballZ_file.py'
Oct 10 10:03:14 compute-2 sudo[213986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14004b70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:14 compute-2 python3.9[213988]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:14 compute-2 sudo[213986]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:14 compute-2 sudo[214014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:03:14 compute-2 sudo[214014]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:03:14 compute-2 sudo[214014]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:15.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:15 compute-2 sudo[214165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frxpvjhoyxaehbtfkqrxwcfuzhbleaex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090594.9630127-2084-45720809086737/AnsiballZ_systemd.py'
Oct 10 10:03:15 compute-2 sudo[214165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:15 compute-2 python3.9[214167]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:03:15 compute-2 systemd[1]: Reloading.
Oct 10 10:03:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:15 compute-2 systemd-rc-local-generator[214197]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:03:15 compute-2 systemd-sysv-generator[214201]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:03:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:15.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:15 compute-2 systemd[1]: Starting Create netns directory...
Oct 10 10:03:15 compute-2 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 10:03:15 compute-2 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 10:03:15 compute-2 systemd[1]: Finished Create netns directory.
Oct 10 10:03:15 compute-2 sudo[214165]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:16 compute-2 ceph-mon[74913]: pgmap v511: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:03:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf0003700 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14004b90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:16 compute-2 podman[214333]: 2025-10-10 10:03:16.878018299 +0000 UTC m=+0.066782695 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Oct 10 10:03:16 compute-2 sudo[214377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcnqxbhbdmpbmwziongozjsgnhijmlpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090596.5256302-2114-257418503909341/AnsiballZ_file.py'
Oct 10 10:03:16 compute-2 sudo[214377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:03:17 compute-2 python3.9[214381]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:03:17 compute-2 sudo[214377]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:17.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:17 compute-2 sudo[214531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpkxwkzematchwetmanyuixwvpesqjtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090597.4045062-2137-14079851584351/AnsiballZ_stat.py'
Oct 10 10:03:17 compute-2 sudo[214531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:03:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:17.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:03:17 compute-2 python3.9[214533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:03:17 compute-2 sudo[214531]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:18 compute-2 ceph-mon[74913]: pgmap v512: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:03:18 compute-2 sudo[214654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djtkosptgmzrbsmhwzkkdvsrhrcbyfwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090597.4045062-2137-14079851584351/AnsiballZ_copy.py'
Oct 10 10:03:18 compute-2 sudo[214654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:18 compute-2 python3.9[214656]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090597.4045062-2137-14079851584351/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:03:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:18 compute-2 sudo[214654]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:03:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002630 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:19.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:19 compute-2 sudo[214809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goryrkulzfxdxjisrmislxsziyefppdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090599.212284-2189-135311927058661/AnsiballZ_file.py'
Oct 10 10:03:19 compute-2 sudo[214809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:19 compute-2 python3.9[214811]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:03:19 compute-2 sudo[214809]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:19.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:20 compute-2 ceph-mon[74913]: pgmap v513: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:03:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:20 compute-2 sudo[214961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhozoqdibgowpbixrnwonzmyljtjpzqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090600.1159778-2212-347685999192/AnsiballZ_stat.py'
Oct 10 10:03:20 compute-2 sudo[214961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:20 compute-2 python3.9[214963]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:03:20 compute-2 sudo[214961]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:20 compute-2 sudo[215087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhkihwthnwlazhezxmhujnehxzkwlobx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090600.1159778-2212-347685999192/AnsiballZ_copy.py'
Oct 10 10:03:20 compute-2 sudo[215087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:21 compute-2 python3.9[215089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090600.1159778-2212-347685999192/.source.json _original_basename=.gg2mztjk follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:21 compute-2 sudo[215087]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:03:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:21.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:03:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:21 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:03:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:21 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:03:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:03:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:21.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:03:21 compute-2 sudo[215239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drlcymigfoizztrqseirdwjncjxzicuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090601.5436616-2258-29927712581053/AnsiballZ_file.py'
Oct 10 10:03:21 compute-2 sudo[215239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:22 compute-2 ceph-mon[74913]: pgmap v514: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:03:22 compute-2 python3.9[215241]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:22 compute-2 sudo[215239]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002630 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:22 compute-2 sudo[215393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enluysyhppgmoztignbcvqcvpeqifakp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090602.532626-2281-187788850810892/AnsiballZ_stat.py'
Oct 10 10:03:22 compute-2 sudo[215393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:23 compute-2 sudo[215393]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:23 compute-2 ceph-mon[74913]: pgmap v515: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 852 B/s wr, 2 op/s
Oct 10 10:03:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:23.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:23 compute-2 sudo[215516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbakyufbflzqlpizivmbbspxdenezcvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090602.532626-2281-187788850810892/AnsiballZ_copy.py'
Oct 10 10:03:23 compute-2 sudo[215516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:23 compute-2 sudo[215516]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:23.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:24 compute-2 sudo[215668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trwagynademxvlmfmtidvwweojccrvwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090604.1591237-2333-62461922343820/AnsiballZ_container_config_data.py'
Oct 10 10:03:24 compute-2 sudo[215668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:03:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002630 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:24 compute-2 python3.9[215670]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct 10 10:03:24 compute-2 sudo[215668]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:03:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:25.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:03:25 compute-2 sudo[215822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyfsikkfsxykdwyxtkcfsuwznskvqetf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090605.1234694-2360-212333631401709/AnsiballZ_container_config_hash.py'
Oct 10 10:03:25 compute-2 sudo[215822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:25 compute-2 sudo[215823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:03:25 compute-2 sudo[215823]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:03:25 compute-2 sudo[215823]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:25 compute-2 sudo[215850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:03:25 compute-2 sudo[215850]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:03:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:25 compute-2 python3.9[215828]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 10:03:25 compute-2 sudo[215822]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:25 compute-2 ceph-mon[74913]: pgmap v516: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 852 B/s wr, 2 op/s
Oct 10 10:03:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:25.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:26 compute-2 sudo[215850]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:26 compute-2 sudo[216054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spfdrongkbpjrxztrsoqvgzxerlveouz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090606.049352-2387-192927510573322/AnsiballZ_podman_container_info.py'
Oct 10 10:03:26 compute-2 sudo[216054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100326 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:03:26 compute-2 python3.9[216056]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 10 10:03:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:26 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:03:26 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:03:26 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:03:26 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:03:26 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:03:26 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:03:26 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:03:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001a70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:26 compute-2 sudo[216054]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:27.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:27 compute-2 ceph-mon[74913]: pgmap v517: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 852 B/s wr, 2 op/s
Oct 10 10:03:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:27.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:28 compute-2 sudo[216235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvmfihozaxqqqkqiazmzjhtsbrvjuyie ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760090608.1332107-2426-18576639174284/AnsiballZ_edpm_container_manage.py'
Oct 10 10:03:28 compute-2 sudo[216235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:28 compute-2 python3[216237]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 10:03:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:29.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:29 compute-2 ceph-mon[74913]: pgmap v518: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:03:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100329 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:03:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:29.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:29 compute-2 podman[216251]: 2025-10-10 10:03:29.945721547 +0000 UTC m=+1.167097737 image pull f541ff382622bd8bc9ad206129d2a8e74c239ff4503fa3b67d3bdf6d5b50b511 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43
Oct 10 10:03:30 compute-2 podman[216309]: 2025-10-10 10:03:30.07231354 +0000 UTC m=+0.045041380 container create 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0)
Oct 10 10:03:30 compute-2 podman[216309]: 2025-10-10 10:03:30.050820803 +0000 UTC m=+0.023548663 image pull f541ff382622bd8bc9ad206129d2a8e74c239ff4503fa3b67d3bdf6d5b50b511 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43
Oct 10 10:03:30 compute-2 python3[216237]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43
Oct 10 10:03:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:30 compute-2 sudo[216235]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001a70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:31 compute-2 sudo[216499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hydjtwwsjullkqvaftbqmxznsgvymdjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090610.7347383-2450-150609107106233/AnsiballZ_stat.py'
Oct 10 10:03:31 compute-2 sudo[216499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:31 compute-2 python3.9[216501]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:03:31 compute-2 sudo[216499]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:31.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:31 compute-2 ceph-mon[74913]: pgmap v519: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 852 B/s wr, 2 op/s
Oct 10 10:03:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:03:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:31.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:31 compute-2 sudo[216653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olssxqguqkmhqmtznbusrygcuspeczuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090611.6751363-2476-152107787031755/AnsiballZ_file.py'
Oct 10 10:03:31 compute-2 sudo[216653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:32 compute-2 python3.9[216655]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:32 compute-2 sudo[216653]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:32 compute-2 sudo[216679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:03:32 compute-2 sudo[216679]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:03:32 compute-2 sudo[216679]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:32 compute-2 sudo[216754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agmozyyjizszzipbbjpellnzusntbfcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090611.6751363-2476-152107787031755/AnsiballZ_stat.py'
Oct 10 10:03:32 compute-2 sudo[216754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:32 compute-2 python3.9[216756]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:03:32 compute-2 sudo[216754]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001a70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:03:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:03:33 compute-2 sudo[216907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kblluzrxlwywqwuadopadlsflpsreyyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090612.642054-2476-194565114366614/AnsiballZ_copy.py'
Oct 10 10:03:33 compute-2 sudo[216907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:33.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:33 compute-2 python3.9[216909]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760090612.642054-2476-194565114366614/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:33 compute-2 sudo[216907]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:33 compute-2 sudo[216983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-povodwsrpmooczpmumvsgxcratpcclwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090612.642054-2476-194565114366614/AnsiballZ_systemd.py'
Oct 10 10:03:33 compute-2 sudo[216983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:33 compute-2 python3.9[216985]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 10:03:33 compute-2 systemd[1]: Reloading.
Oct 10 10:03:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:33.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:33 compute-2 systemd-rc-local-generator[217010]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:03:33 compute-2 systemd-sysv-generator[217014]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:03:34 compute-2 ceph-mon[74913]: pgmap v520: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:03:34 compute-2 sudo[216983]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:34 compute-2 sudo[217094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfgnmrmrbddesjpcfvaaxfhvhsznqbln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090612.642054-2476-194565114366614/AnsiballZ_systemd.py'
Oct 10 10:03:34 compute-2 sudo[217094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004150 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:03:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004150 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:34 compute-2 python3.9[217096]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:03:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:34 compute-2 systemd[1]: Reloading.
Oct 10 10:03:34 compute-2 systemd-sysv-generator[217154]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:03:34 compute-2 systemd-rc-local-generator[217151]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:03:35 compute-2 sudo[217102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:03:35 compute-2 sudo[217102]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:03:35 compute-2 sudo[217102]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:35 compute-2 ceph-mon[74913]: pgmap v521: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 170 B/s wr, 1 op/s
Oct 10 10:03:35 compute-2 systemd[1]: Starting multipathd container...
Oct 10 10:03:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:35 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:03:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:03:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:35.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:03:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7467cedc528fc4d24ee0ba087a31037084463ba23bee2d5d3e37d5f355f749d5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 10 10:03:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7467cedc528fc4d24ee0ba087a31037084463ba23bee2d5d3e37d5f355f749d5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 10:03:35 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c.
Oct 10 10:03:35 compute-2 podman[217162]: 2025-10-10 10:03:35.270851204 +0000 UTC m=+0.116284925 container init 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible)
Oct 10 10:03:35 compute-2 multipathd[217177]: + sudo -E kolla_set_configs
Oct 10 10:03:35 compute-2 sudo[217183]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 10 10:03:35 compute-2 sudo[217183]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 10 10:03:35 compute-2 sudo[217183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 10 10:03:35 compute-2 podman[217162]: 2025-10-10 10:03:35.300557303 +0000 UTC m=+0.145991004 container start 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 10:03:35 compute-2 podman[217162]: multipathd
Oct 10 10:03:35 compute-2 systemd[1]: Started multipathd container.
Oct 10 10:03:35 compute-2 multipathd[217177]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 10:03:35 compute-2 multipathd[217177]: INFO:__main__:Validating config file
Oct 10 10:03:35 compute-2 multipathd[217177]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 10:03:35 compute-2 multipathd[217177]: INFO:__main__:Writing out command to execute
Oct 10 10:03:35 compute-2 sudo[217183]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:35 compute-2 multipathd[217177]: ++ cat /run_command
Oct 10 10:03:35 compute-2 multipathd[217177]: + CMD='/usr/sbin/multipathd -d'
Oct 10 10:03:35 compute-2 multipathd[217177]: + ARGS=
Oct 10 10:03:35 compute-2 multipathd[217177]: + sudo kolla_copy_cacerts
Oct 10 10:03:35 compute-2 sudo[217094]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:35 compute-2 sudo[217205]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 10 10:03:35 compute-2 sudo[217205]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 10 10:03:35 compute-2 sudo[217205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 10 10:03:35 compute-2 podman[217184]: 2025-10-10 10:03:35.360619961 +0000 UTC m=+0.051587899 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 10 10:03:35 compute-2 sudo[217205]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:35 compute-2 multipathd[217177]: + [[ ! -n '' ]]
Oct 10 10:03:35 compute-2 multipathd[217177]: + . kolla_extend_start
Oct 10 10:03:35 compute-2 multipathd[217177]: Running command: '/usr/sbin/multipathd -d'
Oct 10 10:03:35 compute-2 multipathd[217177]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 10 10:03:35 compute-2 multipathd[217177]: + umask 0022
Oct 10 10:03:35 compute-2 multipathd[217177]: + exec /usr/sbin/multipathd -d
Oct 10 10:03:35 compute-2 systemd[1]: 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-7000f04392a2e631.service: Main process exited, code=exited, status=1/FAILURE
Oct 10 10:03:35 compute-2 systemd[1]: 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-7000f04392a2e631.service: Failed with result 'exit-code'.
Oct 10 10:03:35 compute-2 multipathd[217177]: 3518.991339 | --------start up--------
Oct 10 10:03:35 compute-2 multipathd[217177]: 3518.991354 | read /etc/multipath.conf
Oct 10 10:03:35 compute-2 multipathd[217177]: 3518.996783 | path checkers start up
Oct 10 10:03:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:03:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:35.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:03:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001a70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:36 compute-2 python3.9[217366]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:03:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004150 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:37 compute-2 sudo[217520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agjxjryrevmmvyxurmrqaubvcttdqwks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090616.9460075-2585-229722563683018/AnsiballZ_command.py'
Oct 10 10:03:37 compute-2 sudo[217520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:37.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:37 compute-2 python3.9[217522]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:03:37 compute-2 sudo[217520]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:03:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:03:37 compute-2 ceph-mon[74913]: pgmap v522: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 170 B/s wr, 1 op/s
Oct 10 10:03:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:37.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:38 compute-2 sudo[217685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkhwcglfuijmekgbaukbyatlpguvnkxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090617.8451324-2608-104302703095533/AnsiballZ_systemd.py'
Oct 10 10:03:38 compute-2 sudo[217685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:38 compute-2 python3.9[217687]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 10:03:38 compute-2 systemd[1]: Stopping multipathd container...
Oct 10 10:03:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:38 compute-2 multipathd[217177]: 3522.179430 | exit (signal)
Oct 10 10:03:38 compute-2 multipathd[217177]: 3522.179485 | --------shut down-------
Oct 10 10:03:38 compute-2 systemd[1]: libpod-3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c.scope: Deactivated successfully.
Oct 10 10:03:38 compute-2 podman[217691]: 2025-10-10 10:03:38.599226989 +0000 UTC m=+0.080461141 container died 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:03:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:38 compute-2 systemd[1]: 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-7000f04392a2e631.timer: Deactivated successfully.
Oct 10 10:03:38 compute-2 systemd[1]: Stopped /usr/bin/podman healthcheck run 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c.
Oct 10 10:03:38 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-userdata-shm.mount: Deactivated successfully.
Oct 10 10:03:38 compute-2 systemd[1]: var-lib-containers-storage-overlay-7467cedc528fc4d24ee0ba087a31037084463ba23bee2d5d3e37d5f355f749d5-merged.mount: Deactivated successfully.
Oct 10 10:03:38 compute-2 podman[217708]: 2025-10-10 10:03:38.700208394 +0000 UTC m=+0.071615639 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 10 10:03:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001a70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004170 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:38 compute-2 podman[217691]: 2025-10-10 10:03:38.975629 +0000 UTC m=+0.456863142 container cleanup 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct 10 10:03:38 compute-2 podman[217691]: multipathd
Oct 10 10:03:39 compute-2 podman[217742]: multipathd
Oct 10 10:03:39 compute-2 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct 10 10:03:39 compute-2 systemd[1]: Stopped multipathd container.
Oct 10 10:03:39 compute-2 systemd[1]: Starting multipathd container...
Oct 10 10:03:39 compute-2 podman[217743]: 2025-10-10 10:03:39.118148342 +0000 UTC m=+0.101367278 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 10:03:39 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:03:39 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7467cedc528fc4d24ee0ba087a31037084463ba23bee2d5d3e37d5f355f749d5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 10 10:03:39 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7467cedc528fc4d24ee0ba087a31037084463ba23bee2d5d3e37d5f355f749d5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 10:03:39 compute-2 systemd[1]: Started /usr/bin/podman healthcheck run 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c.
Oct 10 10:03:39 compute-2 podman[217767]: 2025-10-10 10:03:39.215805541 +0000 UTC m=+0.121609515 container init 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Oct 10 10:03:39 compute-2 multipathd[217793]: + sudo -E kolla_set_configs
Oct 10 10:03:39 compute-2 sudo[217799]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 10 10:03:39 compute-2 sudo[217799]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 10 10:03:39 compute-2 sudo[217799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 10 10:03:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:39.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:39 compute-2 podman[217767]: 2025-10-10 10:03:39.244456716 +0000 UTC m=+0.150260670 container start 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct 10 10:03:39 compute-2 podman[217767]: multipathd
Oct 10 10:03:39 compute-2 systemd[1]: Started multipathd container.
Oct 10 10:03:39 compute-2 multipathd[217793]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 10:03:39 compute-2 multipathd[217793]: INFO:__main__:Validating config file
Oct 10 10:03:39 compute-2 multipathd[217793]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 10:03:39 compute-2 multipathd[217793]: INFO:__main__:Writing out command to execute
Oct 10 10:03:39 compute-2 sudo[217799]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:39 compute-2 multipathd[217793]: ++ cat /run_command
Oct 10 10:03:39 compute-2 multipathd[217793]: + CMD='/usr/sbin/multipathd -d'
Oct 10 10:03:39 compute-2 multipathd[217793]: + ARGS=
Oct 10 10:03:39 compute-2 multipathd[217793]: + sudo kolla_copy_cacerts
Oct 10 10:03:39 compute-2 sudo[217685]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:39 compute-2 sudo[217822]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 10 10:03:39 compute-2 sudo[217822]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Oct 10 10:03:39 compute-2 sudo[217822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 10 10:03:39 compute-2 sudo[217822]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:39 compute-2 multipathd[217793]: + [[ ! -n '' ]]
Oct 10 10:03:39 compute-2 multipathd[217793]: + . kolla_extend_start
Oct 10 10:03:39 compute-2 multipathd[217793]: Running command: '/usr/sbin/multipathd -d'
Oct 10 10:03:39 compute-2 multipathd[217793]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 10 10:03:39 compute-2 multipathd[217793]: + umask 0022
Oct 10 10:03:39 compute-2 multipathd[217793]: + exec /usr/sbin/multipathd -d
Oct 10 10:03:39 compute-2 podman[217800]: 2025-10-10 10:03:39.31594812 +0000 UTC m=+0.063461539 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 10:03:39 compute-2 systemd[1]: 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-52b87c31bcfd2ac1.service: Main process exited, code=exited, status=1/FAILURE
Oct 10 10:03:39 compute-2 systemd[1]: 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-52b87c31bcfd2ac1.service: Failed with result 'exit-code'.
Oct 10 10:03:39 compute-2 multipathd[217793]: 3522.939492 | --------start up--------
Oct 10 10:03:39 compute-2 multipathd[217793]: 3522.939513 | read /etc/multipath.conf
Oct 10 10:03:39 compute-2 multipathd[217793]: 3522.944947 | path checkers start up
Oct 10 10:03:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:39 compute-2 sudo[217985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nydavowgbndqnszcckrfldrqxndweezz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090619.5002146-2633-203524261913065/AnsiballZ_file.py'
Oct 10 10:03:39 compute-2 sudo[217985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:39.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:39 compute-2 ceph-mon[74913]: pgmap v523: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Oct 10 10:03:40 compute-2 python3.9[217987]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:40 compute-2 sudo[217985]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004170 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:03:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:40 compute-2 sudo[218139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrhgcujbjixifvniavkouzrqofnurxwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090620.6351755-2670-65121756732599/AnsiballZ_file.py'
Oct 10 10:03:40 compute-2 sudo[218139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:41 compute-2 python3.9[218141]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 10 10:03:41 compute-2 sudo[218139]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:41.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:03:41.453 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:03:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:03:41.454 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:03:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:03:41.454 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:03:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:41 compute-2 sudo[218291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlxshvrvvtrzlvzsbczskspuocznqyjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090621.4467251-2692-243652337981323/AnsiballZ_modprobe.py'
Oct 10 10:03:41 compute-2 sudo[218291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:41.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:41 compute-2 python3.9[218293]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct 10 10:03:41 compute-2 kernel: Key type psk registered
Oct 10 10:03:41 compute-2 sudo[218291]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:42 compute-2 ceph-mon[74913]: pgmap v524: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 10:03:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004170 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:42 compute-2 sudo[218455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sumqxrlfwjnukynrcrcqxxbvjkalqicm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090622.3625395-2716-97335423636636/AnsiballZ_stat.py'
Oct 10 10:03:42 compute-2 sudo[218455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:42 compute-2 python3.9[218457]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:03:42 compute-2 sudo[218455]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:43 compute-2 sudo[218579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zasscvxpasvftbbbticdvighzotuviwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090622.3625395-2716-97335423636636/AnsiballZ_copy.py'
Oct 10 10:03:43 compute-2 sudo[218579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:43.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:43 compute-2 python3.9[218581]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090622.3625395-2716-97335423636636/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:43 compute-2 sudo[218579]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:03:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:43.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:03:44 compute-2 ceph-mon[74913]: pgmap v525: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:03:44 compute-2 sudo[218731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biykbzsmavupnmkdcodoikgncbjcjyxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090623.9449265-2765-31931832313254/AnsiballZ_lineinfile.py'
Oct 10 10:03:44 compute-2 sudo[218731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:44 compute-2 python3.9[218733]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:44 compute-2 sudo[218731]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004190 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:45 compute-2 sudo[218885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsmfqrsekcqfurquszdbjkmgznlucbyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090624.7833967-2788-161736645038351/AnsiballZ_systemd.py'
Oct 10 10:03:45 compute-2 sudo[218885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:45.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:45 compute-2 python3.9[218887]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 10:03:45 compute-2 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 10 10:03:45 compute-2 systemd[1]: Stopped Load Kernel Modules.
Oct 10 10:03:45 compute-2 systemd[1]: Stopping Load Kernel Modules...
Oct 10 10:03:45 compute-2 systemd[1]: Starting Load Kernel Modules...
Oct 10 10:03:45 compute-2 systemd[1]: Finished Load Kernel Modules.
Oct 10 10:03:45 compute-2 sudo[218885]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:03:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:45.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:03:46 compute-2 ceph-mon[74913]: pgmap v526: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:03:46 compute-2 sudo[219041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjksgezjniuiuhylefavtsboksfflgnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090625.8932552-2812-104082925006562/AnsiballZ_setup.py'
Oct 10 10:03:46 compute-2 sudo[219041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:46 compute-2 python3.9[219043]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 10:03:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100346 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:03:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:46 compute-2 sudo[219041]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040041b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:03:47 compute-2 sudo[219138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjyotplxcqkkuevgfvniykduucfwokyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090625.8932552-2812-104082925006562/AnsiballZ_dnf.py'
Oct 10 10:03:47 compute-2 sudo[219138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:47 compute-2 podman[219101]: 2025-10-10 10:03:47.232570677 +0000 UTC m=+0.054951895 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 10 10:03:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:03:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:47.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:03:47 compute-2 python3.9[219147]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 10:03:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:47.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:48 compute-2 ceph-mon[74913]: pgmap v527: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:03:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8004380 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:49.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:03:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:49.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:03:50 compute-2 ceph-mon[74913]: pgmap v528: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:03:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040041d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8004380 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14001c50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:51.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:51.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:52 compute-2 ceph-mon[74913]: pgmap v529: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:03:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:03:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:53.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:03:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:53 compute-2 systemd[1]: Reloading.
Oct 10 10:03:53 compute-2 systemd-rc-local-generator[219186]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:03:53 compute-2 systemd-sysv-generator[219190]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:03:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:53.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:54 compute-2 systemd[1]: Reloading.
Oct 10 10:03:54 compute-2 systemd-rc-local-generator[219224]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:03:54 compute-2 systemd-sysv-generator[219228]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:03:54 compute-2 ceph-mon[74913]: pgmap v530: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:03:54 compute-2 systemd-logind[796]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 10 10:03:54 compute-2 systemd-logind[796]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 10 10:03:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14001c50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:54 compute-2 lvm[219268]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 10:03:54 compute-2 lvm[219268]: VG ceph_vg0 finished
Oct 10 10:03:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:54 compute-2 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 10:03:54 compute-2 systemd[1]: Starting man-db-cache-update.service...
Oct 10 10:03:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:03:54 compute-2 systemd[1]: Reloading.
Oct 10 10:03:54 compute-2 kernel: ganesha.nfsd[219154]: segfault at 50 ip 00007f8dd2c3532e sp 00007f8d9dffa210 error 4 in libntirpc.so.5.8[7f8dd2c1a000+2c000] likely on CPU 4 (core 0, socket 4)
Oct 10 10:03:54 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 10:03:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004210 fd 42 proxy ignored for local
Oct 10 10:03:54 compute-2 systemd-sysv-generator[219327]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:03:54 compute-2 systemd-rc-local-generator[219324]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:03:55 compute-2 systemd[1]: Started Process Core Dump (PID 219297/UID 0).
Oct 10 10:03:55 compute-2 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 10:03:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:03:55 compute-2 sudo[219583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:03:55 compute-2 sudo[219583]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:03:55 compute-2 sudo[219583]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:55.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:55 compute-2 sudo[219138]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:55.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:56 compute-2 ceph-mon[74913]: pgmap v531: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:03:56 compute-2 systemd-coredump[219535]: Process 187769 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 67:
                                                    #0  0x00007f8dd2c3532e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Oct 10 10:03:56 compute-2 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 10:03:56 compute-2 systemd[1]: Finished man-db-cache-update.service.
Oct 10 10:03:56 compute-2 systemd[1]: man-db-cache-update.service: Consumed 1.522s CPU time.
Oct 10 10:03:56 compute-2 systemd[1]: run-r047650fce260480c8926b7504c6982d4.service: Deactivated successfully.
Oct 10 10:03:56 compute-2 systemd[1]: systemd-coredump@6-219297-0.service: Deactivated successfully.
Oct 10 10:03:56 compute-2 systemd[1]: systemd-coredump@6-219297-0.service: Consumed 1.110s CPU time.
Oct 10 10:03:56 compute-2 podman[220517]: 2025-10-10 10:03:56.335061921 +0000 UTC m=+0.025454694 container died d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Oct 10 10:03:56 compute-2 systemd[1]: var-lib-containers-storage-overlay-3d1140c1c832a7cbf54fd0203a6ee559ca50a4b6de6ddc3b0879e0b1307a09df-merged.mount: Deactivated successfully.
Oct 10 10:03:56 compute-2 podman[220517]: 2025-10-10 10:03:56.454235907 +0000 UTC m=+0.144628660 container remove d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Oct 10 10:03:56 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 10:03:56 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 10:03:56 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.635s CPU time.
Oct 10 10:03:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:57 compute-2 sudo[220688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucwwxepzkjegtlehlvuyalqgqltayyvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090636.9121335-2849-74411606455737/AnsiballZ_file.py'
Oct 10 10:03:57 compute-2 sudo[220688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:57 compute-2 ceph-mon[74913]: pgmap v532: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:03:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:57.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:57 compute-2 python3.9[220690]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:57 compute-2 sudo[220688]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:57.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:58 compute-2 python3.9[220840]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 10:03:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:59 compute-2 sudo[220996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wftwersaybytwvdgtgvsygqcuudwtrgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090638.8265533-2901-207094345968731/AnsiballZ_file.py'
Oct 10 10:03:59 compute-2 sudo[220996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:03:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:59.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:03:59 compute-2 python3.9[220998]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:03:59 compute-2 sudo[220996]: pam_unix(sudo:session): session closed for user root
Oct 10 10:03:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:03:59 compute-2 ceph-mon[74913]: pgmap v533: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:03:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:03:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:03:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:59.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:00 compute-2 sudo[221149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gclldhwltuqksayboadcubkdzmurvjsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090639.9432364-2935-144844968691614/AnsiballZ_systemd_service.py'
Oct 10 10:04:00 compute-2 sudo[221149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100400 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 2ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:04:00 compute-2 python3.9[221151]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 10:04:00 compute-2 systemd[1]: Reloading.
Oct 10 10:04:01 compute-2 systemd-rc-local-generator[221179]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:04:01 compute-2 systemd-sysv-generator[221182]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:04:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:01.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:01 compute-2 sudo[221149]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:01 compute-2 ceph-mon[74913]: pgmap v534: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:04:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:04:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:01.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:02 compute-2 python3.9[221337]: ansible-ansible.builtin.service_facts Invoked
Oct 10 10:04:02 compute-2 network[221354]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 10:04:02 compute-2 network[221355]: 'network-scripts' will be removed from distribution in near future.
Oct 10 10:04:02 compute-2 network[221356]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 10:04:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:03.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:03 compute-2 ceph-mon[74913]: pgmap v535: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 10:04:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:03.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:05.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:05 compute-2 ceph-mon[74913]: pgmap v536: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:04:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:05.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:06 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 7.
Oct 10 10:04:06 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:04:06 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.635s CPU time.
Oct 10 10:04:06 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 10:04:06 compute-2 podman[221525]: 2025-10-10 10:04:06.861023148 +0000 UTC m=+0.042847910 container create 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1)
Oct 10 10:04:06 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed46e148b9f8eeed63d17e29d76b4f2a4e379a6bb6a82713a88a44ea921fa1d2/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 10:04:06 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed46e148b9f8eeed63d17e29d76b4f2a4e379a6bb6a82713a88a44ea921fa1d2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 10:04:06 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed46e148b9f8eeed63d17e29d76b4f2a4e379a6bb6a82713a88a44ea921fa1d2/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:04:06 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed46e148b9f8eeed63d17e29d76b4f2a4e379a6bb6a82713a88a44ea921fa1d2/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:04:06 compute-2 podman[221525]: 2025-10-10 10:04:06.923915316 +0000 UTC m=+0.105740068 container init 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Oct 10 10:04:06 compute-2 podman[221525]: 2025-10-10 10:04:06.839327235 +0000 UTC m=+0.021152017 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 10:04:06 compute-2 podman[221525]: 2025-10-10 10:04:06.938299456 +0000 UTC m=+0.120124208 container start 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 10:04:06 compute-2 bash[221525]: 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1
Oct 10 10:04:06 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:04:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 10:04:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 10:04:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 10:04:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 10:04:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 10:04:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 10:04:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:07 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 10:04:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:07 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:04:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:07.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:07 compute-2 ceph-mon[74913]: pgmap v537: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:04:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:07.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:08 compute-2 sudo[221742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icbvzehynxphhemvemxmilpgcbwzlxgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090648.6991007-2991-27643259041390/AnsiballZ_systemd_service.py'
Oct 10 10:04:08 compute-2 sudo[221742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:09 compute-2 podman[221744]: 2025-10-10 10:04:09.086767616 +0000 UTC m=+0.083155998 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 10 10:04:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:09.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:09 compute-2 python3.9[221745]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:04:09 compute-2 sudo[221742]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:09 compute-2 podman[221768]: 2025-10-10 10:04:09.424079889 +0000 UTC m=+0.091107441 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 10:04:09 compute-2 podman[221769]: 2025-10-10 10:04:09.431631411 +0000 UTC m=+0.094298874 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 10 10:04:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:09 compute-2 sudo[221961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcvhtqrghbkgyiovkebzbekcdwfrqeyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090649.4768894-2991-2714072195323/AnsiballZ_systemd_service.py'
Oct 10 10:04:09 compute-2 sudo[221961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:09 compute-2 ceph-mon[74913]: pgmap v538: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:04:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:09.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:10 compute-2 python3.9[221963]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:04:10 compute-2 sudo[221961]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:10 compute-2 sudo[222114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnmtfinzxdktltwoaieydpjskvjgslzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090650.2212107-2991-108381264135659/AnsiballZ_systemd_service.py'
Oct 10 10:04:10 compute-2 sudo[222114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:10 compute-2 python3.9[222116]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:04:10 compute-2 sudo[222114]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:11 compute-2 sudo[222269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otjnsmpjhrmwxihfnjuzsyoitzjwksgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090650.964674-2991-190840026929066/AnsiballZ_systemd_service.py'
Oct 10 10:04:11 compute-2 sudo[222269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:11.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:11 compute-2 python3.9[222271]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:04:11 compute-2 sudo[222269]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:11 compute-2 ceph-mon[74913]: pgmap v539: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:04:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:11.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:11 compute-2 sudo[222422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icdnwnkatqlqzjsmjrtapwdifsijsbuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090651.6706438-2991-116080983068350/AnsiballZ_systemd_service.py'
Oct 10 10:04:11 compute-2 sudo[222422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:12 compute-2 python3.9[222424]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:04:12 compute-2 sudo[222422]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:12 compute-2 sudo[222576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmxuxzqhufgmluwqiqybyakfyztymprl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090652.3917687-2991-199860441414433/AnsiballZ_systemd_service.py'
Oct 10 10:04:12 compute-2 sudo[222576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:12 compute-2 python3.9[222578]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:04:12 compute-2 sudo[222576]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:13 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:04:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:13 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:04:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:13.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:13 compute-2 sudo[222730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaetgkrdmoahrpastiifiizrgastuuny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090653.1151783-2991-116918619061734/AnsiballZ_systemd_service.py'
Oct 10 10:04:13 compute-2 sudo[222730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:13 compute-2 python3.9[222732]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:04:13 compute-2 ceph-mon[74913]: pgmap v540: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 10:04:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:13.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:14 compute-2 sudo[222730]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:15 compute-2 sudo[222885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrxvvtiqlficqalirvyzltxwswywrxhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090654.862886-2991-172796192433252/AnsiballZ_systemd_service.py'
Oct 10 10:04:15 compute-2 sudo[222885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:15.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:15 compute-2 sudo[222888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:04:15 compute-2 sudo[222888]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:04:15 compute-2 sudo[222888]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:15 compute-2 python3.9[222887]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:04:15 compute-2 sudo[222885]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:15 compute-2 ceph-mon[74913]: pgmap v541: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 10:04:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:15.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:16 compute-2 sudo[223063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gocdpcltbywtynfzuxmjgwacdysyxmae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090656.119408-3167-150686681641992/AnsiballZ_file.py'
Oct 10 10:04:16 compute-2 sudo[223063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:16 compute-2 python3.9[223065]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:16 compute-2 sudo[223063]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:04:17 compute-2 sudo[223217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igqwdjnjlvtzmrutvgnjzhsejokhadri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090656.741735-3167-12100606356515/AnsiballZ_file.py'
Oct 10 10:04:17 compute-2 sudo[223217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:17 compute-2 python3.9[223219]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:17 compute-2 sudo[223217]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:04:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:17.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:04:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:17 compute-2 podman[223343]: 2025-10-10 10:04:17.689310661 +0000 UTC m=+0.063879301 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 10 10:04:17 compute-2 sudo[223388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzsrulfzikkayjvemdjiymugzabroljy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090657.3642282-3167-262343159161791/AnsiballZ_file.py'
Oct 10 10:04:17 compute-2 sudo[223388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:17 compute-2 ceph-mon[74913]: pgmap v542: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Oct 10 10:04:17 compute-2 python3.9[223390]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:17 compute-2 sudo[223388]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:04:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:17.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:04:18 compute-2 sudo[223540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlzldggnjdfexfwtmgpdphltqntarskg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090658.0545034-3167-243180204457724/AnsiballZ_file.py'
Oct 10 10:04:18 compute-2 sudo[223540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:18 compute-2 python3.9[223542]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:18 compute-2 sudo[223540]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:18 compute-2 sudo[223694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xafsgioafydhpwtmkfxpnptjkxxschys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090658.692164-3167-73592164639808/AnsiballZ_file.py'
Oct 10 10:04:18 compute-2 sudo[223694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:04:19 compute-2 python3.9[223696]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:19 compute-2 sudo[223694]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:19.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:19 compute-2 sudo[223858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhhyqrnrjsxlbwgzeotmulpgwmhcnfxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090659.340568-3167-55511292398656/AnsiballZ_file.py'
Oct 10 10:04:19 compute-2 sudo[223858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:19 compute-2 python3.9[223860]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:19 compute-2 sudo[223858]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:19 compute-2 ceph-mon[74913]: pgmap v543: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:04:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:19.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:20 compute-2 sudo[224010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pibyemhraifxhuifmgnrlchgwoaveyej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090659.9450915-3167-238479146391127/AnsiballZ_file.py'
Oct 10 10:04:20 compute-2 sudo[224010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:20 compute-2 python3.9[224012]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:20 compute-2 sudo[224010]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa86c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:20 compute-2 sudo[224167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoslmggmphmkwwoudplsvekzutvfpign ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090660.558118-3167-36719641908494/AnsiballZ_file.py'
Oct 10 10:04:20 compute-2 sudo[224167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:21 compute-2 python3.9[224169]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:21 compute-2 sudo[224167]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:21.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:21.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:21 compute-2 ceph-mon[74913]: pgmap v544: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:04:22 compute-2 sudo[224319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoagfdjknnltlvpbnqggsngprigeynbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090662.0717921-3338-222309616963040/AnsiballZ_file.py'
Oct 10 10:04:22 compute-2 sudo[224319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:22 compute-2 python3.9[224321]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:22 compute-2 sudo[224319]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100422 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:04:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:23 compute-2 sudo[224473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhvlynegbplpppgrxrjznrdxwfpoteyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090662.750253-3338-161670267251061/AnsiballZ_file.py'
Oct 10 10:04:23 compute-2 sudo[224473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:23 compute-2 python3.9[224475]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:23 compute-2 sudo[224473]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:23.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:23 compute-2 sudo[224625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeurrytohauyauizfijcszgoxrzjihiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090663.3888254-3338-80191401447721/AnsiballZ_file.py'
Oct 10 10:04:23 compute-2 sudo[224625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:23 compute-2 python3.9[224627]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:23 compute-2 sudo[224625]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:23.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:23 compute-2 ceph-mon[74913]: pgmap v545: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:04:24 compute-2 sudo[224777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fusetzldchjnoevhnmgxwypuxtgxcxke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090663.9733717-3338-6547910926030/AnsiballZ_file.py'
Oct 10 10:04:24 compute-2 sudo[224777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:24 compute-2 python3.9[224779]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:24 compute-2 sudo[224777]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:24 compute-2 sudo[224930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xknvufkwnojjsasdkcepbalwgshthekr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090664.5184464-3338-237534994426246/AnsiballZ_file.py'
Oct 10 10:04:24 compute-2 sudo[224930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:24 compute-2 python3.9[224932]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:24 compute-2 sudo[224930]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:04:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:25.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:04:25 compute-2 sudo[225083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wshtsuphuhdooghllyanhacjvixhhjgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090665.084057-3338-124089371595971/AnsiballZ_file.py'
Oct 10 10:04:25 compute-2 sudo[225083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:25 compute-2 python3.9[225085]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:25 compute-2 sudo[225083]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:25 compute-2 sudo[225235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odmkxtqhvrxqnyscjrryvmhskfarbtvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090665.652405-3338-181680951972321/AnsiballZ_file.py'
Oct 10 10:04:25 compute-2 sudo[225235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:25.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:25 compute-2 ceph-mon[74913]: pgmap v546: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:04:25 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Oct 10 10:04:25 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:25.997193) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:04:25 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Oct 10 10:04:25 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090665997265, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 935, "num_deletes": 251, "total_data_size": 2010408, "memory_usage": 2039184, "flush_reason": "Manual Compaction"}
Oct 10 10:04:25 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090666012198, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1327534, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19913, "largest_seqno": 20842, "table_properties": {"data_size": 1323321, "index_size": 1929, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9372, "raw_average_key_size": 19, "raw_value_size": 1314853, "raw_average_value_size": 2727, "num_data_blocks": 86, "num_entries": 482, "num_filter_entries": 482, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090593, "oldest_key_time": 1760090593, "file_creation_time": 1760090665, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 15059 microseconds, and 6369 cpu microseconds.
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.012260) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1327534 bytes OK
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.012285) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.014497) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.014518) EVENT_LOG_v1 {"time_micros": 1760090666014512, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.014535) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 2005736, prev total WAL file size 2005736, number of live WAL files 2.
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.015163) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1296KB)], [36(13MB)]
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090666015231, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 15030296, "oldest_snapshot_seqno": -1}
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4970 keys, 12865378 bytes, temperature: kUnknown
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090666092334, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 12865378, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12831358, "index_size": 20470, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12485, "raw_key_size": 126819, "raw_average_key_size": 25, "raw_value_size": 12740312, "raw_average_value_size": 2563, "num_data_blocks": 839, "num_entries": 4970, "num_filter_entries": 4970, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090666, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.092544) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 12865378 bytes
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.094789) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 194.8 rd, 166.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 13.1 +0.0 blob) out(12.3 +0.0 blob), read-write-amplify(21.0) write-amplify(9.7) OK, records in: 5486, records dropped: 516 output_compression: NoCompression
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.094805) EVENT_LOG_v1 {"time_micros": 1760090666094797, "job": 20, "event": "compaction_finished", "compaction_time_micros": 77163, "compaction_time_cpu_micros": 22472, "output_level": 6, "num_output_files": 1, "total_output_size": 12865378, "num_input_records": 5486, "num_output_records": 4970, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090666095092, "job": 20, "event": "table_file_deletion", "file_number": 38}
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090666097278, "job": 20, "event": "table_file_deletion", "file_number": 36}
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.015067) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.097325) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.097331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.097332) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.097334) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:04:26 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.097335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:04:26 compute-2 python3.9[225237]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:26 compute-2 sudo[225235]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:26 compute-2 sudo[225388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdgbacqlfujlsusqvwcqlzzgudadbdro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090666.2936573-3338-259571663797641/AnsiballZ_file.py'
Oct 10 10:04:26 compute-2 sudo[225388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:26 compute-2 python3.9[225390]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:04:26 compute-2 sudo[225388]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:27.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:27.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:28 compute-2 ceph-mon[74913]: pgmap v547: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:04:28 compute-2 sudo[225541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rypkscfrrsesblmtkwktsqsevpcqxmxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090667.9932468-3513-268115441632918/AnsiballZ_command.py'
Oct 10 10:04:28 compute-2 sudo[225541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:28 compute-2 python3.9[225543]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:04:28 compute-2 sudo[225541]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:29.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:29 compute-2 python3.9[225697]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 10:04:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:29.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:30 compute-2 ceph-mon[74913]: pgmap v548: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:04:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:30 compute-2 sudo[225847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-higoyaywtkyqnwfxayuvezhbldfgqvsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090669.9972272-3567-41175359199051/AnsiballZ_systemd_service.py'
Oct 10 10:04:30 compute-2 sudo[225847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:30 compute-2 python3.9[225849]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 10:04:30 compute-2 systemd[1]: Reloading.
Oct 10 10:04:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:30 compute-2 systemd-rc-local-generator[225875]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:04:30 compute-2 systemd-sysv-generator[225879]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:04:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:30 compute-2 sudo[225847]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:31.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:31 compute-2 sudo[226036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csrczekhadyrvcxqxkdlupjwwemxsebj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090671.2874315-3591-90497516110178/AnsiballZ_command.py'
Oct 10 10:04:31 compute-2 sudo[226036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:31 compute-2 python3.9[226038]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:04:31 compute-2 sudo[226036]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:31.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:32 compute-2 ceph-mon[74913]: pgmap v549: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:04:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:04:32 compute-2 sudo[226189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wstaybajepjpohbwbknjytamubtatmcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090671.9432256-3591-273083611048647/AnsiballZ_command.py'
Oct 10 10:04:32 compute-2 sudo[226189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:32 compute-2 python3.9[226191]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:04:32 compute-2 sudo[226189]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:32 compute-2 sudo[226192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:04:32 compute-2 sudo[226192]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:04:32 compute-2 sudo[226192]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:32 compute-2 sudo[226222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:04:32 compute-2 sudo[226222]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:04:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:32 compute-2 sudo[226407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mltuxikukintlgcdujmibiohjjpbdmxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090672.5576954-3591-32821002236032/AnsiballZ_command.py'
Oct 10 10:04:32 compute-2 sudo[226407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:32 compute-2 python3.9[226410]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:04:32 compute-2 sudo[226222]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:33 compute-2 sudo[226407]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:33.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:33 compute-2 sudo[226578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfwomoadjkyoojertjbzfqnhfafqmnrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090673.1360114-3591-9981929174895/AnsiballZ_command.py'
Oct 10 10:04:33 compute-2 sudo[226578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:33 compute-2 python3.9[226580]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:04:33 compute-2 sudo[226578]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100433 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:04:33 compute-2 ceph-mon[74913]: pgmap v550: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:04:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:04:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:04:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:04:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:04:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:04:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:04:33 compute-2 sudo[226731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbvjuvbljhnvduvtxwukssnnjijohgur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090673.6896331-3591-4019135634879/AnsiballZ_command.py'
Oct 10 10:04:33 compute-2 sudo[226731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:04:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:33.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:04:34 compute-2 python3.9[226733]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:04:34 compute-2 sudo[226731]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100434 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:04:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:34 compute-2 sudo[226885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfarytyuwhgzajcsgthhbfjlnxtqeyrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090674.370489-3591-269652775075698/AnsiballZ_command.py'
Oct 10 10:04:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:34 compute-2 sudo[226885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:34 compute-2 python3.9[226887]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:04:34 compute-2 sudo[226885]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:04:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:04:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:04:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:04:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:04:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:35 compute-2 sudo[227039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbusdhingmhjuwdqlojfrvaowdgykqbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090674.953468-3591-121100603098679/AnsiballZ_command.py'
Oct 10 10:04:35 compute-2 sudo[227039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:35.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:35 compute-2 sudo[227042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:04:35 compute-2 sudo[227042]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:04:35 compute-2 sudo[227042]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:35 compute-2 python3.9[227041]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:04:35 compute-2 sudo[227039]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:35 compute-2 sudo[227217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxxrlyzryfqryxnccsnozicmbqkelqdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090675.5898075-3591-181301276737160/AnsiballZ_command.py'
Oct 10 10:04:35 compute-2 sudo[227217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:35 compute-2 ceph-mon[74913]: pgmap v551: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:04:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:35.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:36 compute-2 python3.9[227219]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 10:04:36 compute-2 sudo[227217]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:37.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:37 compute-2 ceph-mon[74913]: pgmap v552: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:04:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:37.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:38 compute-2 sudo[227373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpihenaqhoyackbgcikhuvczskensgjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090678.3058836-3798-256514953398911/AnsiballZ_file.py'
Oct 10 10:04:38 compute-2 sudo[227373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:38 compute-2 python3.9[227375]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:38 compute-2 sudo[227373]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:39 compute-2 sudo[227476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:04:39 compute-2 sudo[227476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:04:39 compute-2 sudo[227476]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:39 compute-2 sudo[227566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkjxsqsxlmwogjeegybcrpkaplurqvtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090678.9297168-3798-195469753835815/AnsiballZ_file.py'
Oct 10 10:04:39 compute-2 sudo[227566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:39 compute-2 podman[227523]: 2025-10-10 10:04:39.224128259 +0000 UTC m=+0.068142567 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 10:04:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:39.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:39 compute-2 python3.9[227572]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:39 compute-2 sudo[227566]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:39 compute-2 podman[227574]: 2025-10-10 10:04:39.543798069 +0000 UTC m=+0.061738042 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Oct 10 10:04:39 compute-2 podman[227575]: 2025-10-10 10:04:39.565969337 +0000 UTC m=+0.083498298 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:04:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:39 compute-2 sudo[227770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yswdqngefbgmwrlvwuayrgzjuuqkoqjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090679.6123672-3798-37606395992129/AnsiballZ_file.py'
Oct 10 10:04:39 compute-2 sudo[227770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:39 compute-2 ceph-mon[74913]: pgmap v553: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:04:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:04:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:04:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:39.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:40 compute-2 python3.9[227772]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:40 compute-2 sudo[227770]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:40 compute-2 sudo[227924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtbheygmzrlckrcssypdjblpcujfhahi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090680.593205-3864-18174290658437/AnsiballZ_file.py'
Oct 10 10:04:40 compute-2 sudo[227924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:41 compute-2 python3.9[227926]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:41 compute-2 sudo[227924]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:41.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:04:41.455 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:04:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:04:41.457 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:04:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:04:41.457 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:04:41 compute-2 sudo[228076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvpbjmnpowriqmtuqmujdqcxssjlsawe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090681.3144522-3864-6842865934346/AnsiballZ_file.py'
Oct 10 10:04:41 compute-2 sudo[228076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:41 compute-2 python3.9[228078]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:41 compute-2 sudo[228076]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:41 compute-2 ceph-mon[74913]: pgmap v554: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Oct 10 10:04:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:41.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:42 compute-2 sudo[228228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylmeymxcgltdpcspwkaovybyvksroste ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090681.9722202-3864-266295693349961/AnsiballZ_file.py'
Oct 10 10:04:42 compute-2 sudo[228228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:42 compute-2 python3.9[228230]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:42 compute-2 sudo[228228]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:04:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:42 compute-2 sudo[228382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpzqmlmkhybcobyiflknkhxuabbnyttc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090682.5874279-3864-164069818408937/AnsiballZ_file.py'
Oct 10 10:04:42 compute-2 sudo[228382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:43 compute-2 python3.9[228384]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:43 compute-2 sudo[228382]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:43.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:43 compute-2 sudo[228534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xecwubpucyjfxjamodkhlrncqrjhryvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090683.171549-3864-233899671763380/AnsiballZ_file.py'
Oct 10 10:04:43 compute-2 sudo[228534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:43 compute-2 python3.9[228536]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:43 compute-2 sudo[228534]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:43 compute-2 ceph-mon[74913]: pgmap v555: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 170 B/s wr, 1 op/s
Oct 10 10:04:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:43.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:44 compute-2 sudo[228686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arwrblmgldwbkhmobhifihyyvhcqxkvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090683.8021557-3864-108973724374430/AnsiballZ_file.py'
Oct 10 10:04:44 compute-2 sudo[228686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:44 compute-2 python3.9[228688]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:44 compute-2 sudo[228686]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:44 compute-2 sudo[228839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvgkndafbmtudygchijldnfgdnqwvbpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090684.3871126-3864-271139198737822/AnsiballZ_file.py'
Oct 10 10:04:44 compute-2 sudo[228839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:44 compute-2 python3.9[228841]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:44 compute-2 sudo[228839]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:45 compute-2 sudo[228992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmillozqgeivumaopeohhcyvtwygpcor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090684.9745138-3864-261198291639430/AnsiballZ_file.py'
Oct 10 10:04:45 compute-2 sudo[228992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:45.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:45 compute-2 python3.9[228994]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:45 compute-2 sudo[228992]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:45 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:04:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:45 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:04:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:45 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:04:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:45 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:04:45 compute-2 sudo[229144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quzfsmvljkfcvqtdmprgucxtrsvjpjkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090685.6207814-3864-113072345485205/AnsiballZ_file.py'
Oct 10 10:04:45 compute-2 sudo[229144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:45 compute-2 ceph-mon[74913]: pgmap v556: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Oct 10 10:04:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:45.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:46 compute-2 python3.9[229146]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:46 compute-2 sudo[229144]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:04:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:47.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:47 compute-2 ceph-mon[74913]: pgmap v557: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Oct 10 10:04:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:47.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:04:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:48 compute-2 podman[229174]: 2025-10-10 10:04:48.816144057 +0000 UTC m=+0.084476759 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 10 10:04:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:49.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:49 compute-2 ceph-mon[74913]: pgmap v558: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:04:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:49.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:51.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:51.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:51 compute-2 ceph-mon[74913]: pgmap v559: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:04:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:52 compute-2 sudo[229325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npkgnwtxaqwtqflpzussgvvotinqbcrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090692.451571-4231-140578694976241/AnsiballZ_getent.py'
Oct 10 10:04:52 compute-2 sudo[229325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:53 compute-2 python3.9[229327]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct 10 10:04:53 compute-2 sudo[229325]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:53.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100453 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:04:53 compute-2 sudo[229478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woggmeprikmqvfyntrymzuohkynpvukj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090693.390352-4254-117788625519318/AnsiballZ_group.py'
Oct 10 10:04:53 compute-2 sudo[229478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:53.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:53 compute-2 ceph-mon[74913]: pgmap v560: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Oct 10 10:04:54 compute-2 python3.9[229480]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 10:04:54 compute-2 groupadd[229481]: group added to /etc/group: name=nova, GID=42436
Oct 10 10:04:54 compute-2 groupadd[229481]: group added to /etc/gshadow: name=nova
Oct 10 10:04:54 compute-2 groupadd[229481]: new group: name=nova, GID=42436
Oct 10 10:04:54 compute-2 sudo[229478]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100454 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:04:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868001ac0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:55 compute-2 sudo[229638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfsthqxdwfifocbustktuywqolxuyfxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090694.4405556-4279-89096784084745/AnsiballZ_user.py'
Oct 10 10:04:55 compute-2 sudo[229638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:04:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:04:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:04:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:55.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:04:55 compute-2 python3.9[229640]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 10 10:04:55 compute-2 useradd[229642]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Oct 10 10:04:55 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 10:04:55 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 10:04:55 compute-2 useradd[229642]: add 'nova' to group 'libvirt'
Oct 10 10:04:55 compute-2 useradd[229642]: add 'nova' to shadow group 'libvirt'
Oct 10 10:04:55 compute-2 sudo[229638]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:55 compute-2 sudo[229650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:04:55 compute-2 sudo[229650]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:04:55 compute-2 sudo[229650]: pam_unix(sudo:session): session closed for user root
Oct 10 10:04:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:55.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:56 compute-2 ceph-mon[74913]: pgmap v561: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:04:56 compute-2 sshd-session[229699]: Accepted publickey for zuul from 192.168.122.30 port 36432 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 10:04:56 compute-2 systemd-logind[796]: New session 56 of user zuul.
Oct 10 10:04:56 compute-2 systemd[1]: Started Session 56 of User zuul.
Oct 10 10:04:56 compute-2 sshd-session[229699]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 10:04:56 compute-2 ceph-osd[77423]: bluestore.MempoolThread fragmentation_score=0.000024 took=0.000092s
Oct 10 10:04:56 compute-2 sshd-session[229702]: Received disconnect from 192.168.122.30 port 36432:11: disconnected by user
Oct 10 10:04:56 compute-2 sshd-session[229702]: Disconnected from user zuul 192.168.122.30 port 36432
Oct 10 10:04:56 compute-2 sshd-session[229699]: pam_unix(sshd:session): session closed for user zuul
Oct 10 10:04:56 compute-2 systemd[1]: session-56.scope: Deactivated successfully.
Oct 10 10:04:56 compute-2 systemd-logind[796]: Session 56 logged out. Waiting for processes to exit.
Oct 10 10:04:56 compute-2 systemd-logind[796]: Removed session 56.
Oct 10 10:04:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8380016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8680023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:57 compute-2 python3.9[229854]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:04:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:57.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:57 compute-2 python3.9[229975]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090696.8281558-4354-184256217593510/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:57.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:58 compute-2 ceph-mon[74913]: pgmap v562: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:04:58 compute-2 python3.9[230125]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:04:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:58 compute-2 python3.9[230201]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:04:59 compute-2 python3.9[230353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:04:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:04:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:59.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:04:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:04:59 compute-2 python3.9[230474]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090698.8020444-4354-57649100125623/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:04:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:04:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:04:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:59.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:05:00 compute-2 ceph-mon[74913]: pgmap v563: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:05:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8680023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:00 compute-2 python3.9[230624]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:05:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:01 compute-2 python3.9[230747]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090700.17484-4354-213474519980429/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:05:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:01.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:01 compute-2 python3.9[230897]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:05:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:01.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:02 compute-2 ceph-mon[74913]: pgmap v564: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 1 op/s
Oct 10 10:05:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:05:02 compute-2 python3.9[231018]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090701.4614491-4354-39984000135181/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:05:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8680030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:05:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:03.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:05:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:03.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:04 compute-2 ceph-mon[74913]: pgmap v565: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 170 B/s wr, 1 op/s
Oct 10 10:05:04 compute-2 sudo[231170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbriapuxmjaihlwmetptkhhuivngyyfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090703.9228632-4561-62298350569891/AnsiballZ_file.py'
Oct 10 10:05:04 compute-2 sudo[231170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:04 compute-2 python3.9[231172]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:05:04 compute-2 sudo[231170]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8680030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:04 compute-2 sudo[231324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xozfjdwnjnsqwjgpeecyizcybswnjqrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090704.714345-4585-211067475638718/AnsiballZ_copy.py'
Oct 10 10:05:04 compute-2 sudo[231324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:05 compute-2 python3.9[231326]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:05:05 compute-2 sudo[231324]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:05.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:05 compute-2 sudo[231476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmrsbevjvogmxphbdvcncxdngjppalvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090705.6016576-4609-188547112653891/AnsiballZ_stat.py'
Oct 10 10:05:05 compute-2 sudo[231476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:05.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:06 compute-2 python3.9[231478]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:05:06 compute-2 sudo[231476]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:06 compute-2 ceph-mon[74913]: pgmap v566: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:06 compute-2 sudo[231629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izhqyffqyxydmghaeoythgxnjkbnlxbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090706.400211-4633-271045842952785/AnsiballZ_stat.py'
Oct 10 10:05:06 compute-2 sudo[231629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003800 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:06 compute-2 python3.9[231631]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:05:06 compute-2 sudo[231629]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:07 compute-2 sudo[231753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mezwhzjmtinutcpwqwmincukqkbnaytu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090706.400211-4633-271045842952785/AnsiballZ_copy.py'
Oct 10 10:05:07 compute-2 sudo[231753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:07.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:07 compute-2 python3.9[231755]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1760090706.400211-4633-271045842952785/.source _original_basename=.ktnnso3a follow=False checksum=19921791aaa0ec1498e105f7ca2c9a3a0c3d4795 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct 10 10:05:07 compute-2 sudo[231753]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:07.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:08 compute-2 ceph-mon[74913]: pgmap v567: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:08 compute-2 python3.9[231907]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:05:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8380032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:09.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:09 compute-2 python3.9[232061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:05:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:09 compute-2 podman[232158]: 2025-10-10 10:05:09.789948147 +0000 UTC m=+0.054680408 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 10:05:09 compute-2 podman[232156]: 2025-10-10 10:05:09.791151755 +0000 UTC m=+0.065034658 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 10 10:05:09 compute-2 podman[232157]: 2025-10-10 10:05:09.817692313 +0000 UTC m=+0.085793202 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 10:05:09 compute-2 python3.9[232231]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090708.9445035-4712-37263100983407/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=837ffd9c004e5987a2e117698c56827ebbfeb5b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:05:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:10.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:10 compute-2 ceph-mon[74913]: pgmap v568: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:10 compute-2 python3.9[232399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 10:05:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:11 compute-2 python3.9[232521]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090710.332985-4756-88156178199187/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=722ab36345f3375cbdcf911ce8f6e1a8083d7e59 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 10:05:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:11.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:12.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:12 compute-2 ceph-mon[74913]: pgmap v569: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:12 compute-2 sudo[232671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thuofoeemxqlhyqwwzliepdklnthxzls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090712.0328486-4806-123085488346678/AnsiballZ_container_config_data.py'
Oct 10 10:05:12 compute-2 sudo[232671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:12 compute-2 python3.9[232673]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct 10 10:05:12 compute-2 sudo[232671]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8380032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:13 compute-2 sudo[232825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jisbqvrerdzwwxjoydighhonuzlgqaft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090712.9313185-4834-96462303056967/AnsiballZ_container_config_hash.py'
Oct 10 10:05:13 compute-2 sudo[232825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:13.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:13 compute-2 python3.9[232827]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 10:05:13 compute-2 sudo[232825]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:14.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:14 compute-2 ceph-mon[74913]: pgmap v570: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:05:14 compute-2 sudo[232977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmqdhnucquboueyxkoxayyfdiioyzrar ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760090713.905965-4863-36259276054365/AnsiballZ_edpm_container_manage.py'
Oct 10 10:05:14 compute-2 sudo[232977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:14 compute-2 python3[232979]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 10:05:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:15 compute-2 ceph-mon[74913]: pgmap v571: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:15.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:15 compute-2 sudo[233016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:05:15 compute-2 sudo[233016]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:05:15 compute-2 sudo[233016]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:16.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:05:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:17.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:17 compute-2 ceph-mon[74913]: pgmap v572: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:18.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:18 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:18 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:18 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:05:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:19.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:05:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:05:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 3941 writes, 21K keys, 3941 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s
                                           Cumulative WAL: 3941 writes, 3941 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1443 writes, 6884 keys, 1443 commit groups, 1.0 writes per commit group, ingest: 16.49 MB, 0.03 MB/s
                                           Interval WAL: 1443 writes, 1443 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    132.0      0.24              0.08        10    0.024       0      0       0.0       0.0
                                             L6      1/0   12.27 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    168.2    142.7      0.78              0.28         9    0.087     43K   4823       0.0       0.0
                                            Sum      1/0   12.27 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5    128.3    140.2      1.03              0.37        19    0.054     43K   4823       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.4    120.9    121.2      0.52              0.13         8    0.065     22K   2562       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    168.2    142.7      0.78              0.28         9    0.087     43K   4823       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    133.0      0.24              0.08         9    0.027       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.031, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.14 GB write, 0.12 MB/s write, 0.13 GB read, 0.11 MB/s read, 1.0 seconds
                                           Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x56161a963350#2 capacity: 304.00 MB usage: 8.78 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 9.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(481,8.41 MB,2.76774%) FilterBlock(19,130.05 KB,0.041776%) IndexBlock(19,240.70 KB,0.0773229%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Oct 10 10:05:19 compute-2 podman[233063]: 2025-10-10 10:05:19.910721718 +0000 UTC m=+0.186692046 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 10:05:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:20.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:20 compute-2 ceph-mon[74913]: pgmap v573: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:21.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:21 compute-2 ceph-mon[74913]: pgmap v574: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:05:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:22.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:05:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:23.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:23 compute-2 ceph-mon[74913]: pgmap v575: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:05:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:24.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:25 compute-2 podman[232992]: 2025-10-10 10:05:25.062605743 +0000 UTC m=+10.593652548 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844
Oct 10 10:05:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:25 compute-2 podman[233138]: 2025-10-10 10:05:25.173499911 +0000 UTC m=+0.021411925 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844
Oct 10 10:05:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:25.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:26.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:26 compute-2 ceph-mon[74913]: pgmap v576: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:26 compute-2 podman[233138]: 2025-10-10 10:05:26.135621926 +0000 UTC m=+0.983533920 container create 95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true)
Oct 10 10:05:26 compute-2 python3[232979]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct 10 10:05:26 compute-2 sudo[232977]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:27 compute-2 sudo[233328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odexlitlwuoggcncqyyipqwuahewnixc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090726.783398-4888-218045861630237/AnsiballZ_stat.py'
Oct 10 10:05:27 compute-2 sudo[233328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:27 compute-2 python3.9[233330]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:05:27 compute-2 ceph-mon[74913]: pgmap v577: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:27.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:27 compute-2 sudo[233328]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:28.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:28 compute-2 sudo[233482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcjqhnzeynlakqrqfioummdixechtcve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090728.1136518-4924-114408959678996/AnsiballZ_container_config_data.py'
Oct 10 10:05:28 compute-2 sudo[233482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:28 compute-2 python3.9[233484]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct 10 10:05:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:28 compute-2 sudo[233482]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0039e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:29 compute-2 sudo[233636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvpdwunrwojxggmxihkozvsmnikfrvij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090729.0415611-4951-271518737761269/AnsiballZ_container_config_hash.py'
Oct 10 10:05:29 compute-2 sudo[233636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:29.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:29 compute-2 ceph-mon[74913]: pgmap v578: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:29 compute-2 python3.9[233638]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 10:05:29 compute-2 sudo[233636]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:30.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:30 compute-2 sudo[233789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxukrbjkjsunvfeqtskxutkkokokjzag ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1760090730.2165775-4981-266449665086511/AnsiballZ_edpm_container_manage.py'
Oct 10 10:05:30 compute-2 sudo[233789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:30 compute-2 python3[233791]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 10:05:31 compute-2 podman[233831]: 2025-10-10 10:05:31.065885394 +0000 UTC m=+0.055971877 container create a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:05:31 compute-2 podman[233831]: 2025-10-10 10:05:31.034522453 +0000 UTC m=+0.024609026 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844
Oct 10 10:05:31 compute-2 python3[233791]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844 kolla_start
Oct 10 10:05:31 compute-2 sudo[233789]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:05:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:31.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:05:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:31 compute-2 ceph-mon[74913]: pgmap v579: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:05:31 compute-2 sudo[234018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qarltxsqsvjbtkghxguipseqezcryvrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090731.6142282-5005-66010225883276/AnsiballZ_stat.py'
Oct 10 10:05:31 compute-2 sudo[234018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:32.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:32 compute-2 python3.9[234020]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:05:32 compute-2 sudo[234018]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:32 compute-2 sudo[234173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcyccyzjmojnajipzbdxivwganzxctyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090732.5481966-5031-42706647857103/AnsiballZ_file.py'
Oct 10 10:05:32 compute-2 sudo[234173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:32 compute-2 python3.9[234175]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:05:32 compute-2 sudo[234173]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:33.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:33 compute-2 sudo[234325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjbxzfwzznkasdmtskbnkigwdbritxlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090733.0295084-5031-230863453702335/AnsiballZ_copy.py'
Oct 10 10:05:33 compute-2 sudo[234325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:33 compute-2 python3.9[234327]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760090733.0295084-5031-230863453702335/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 10:05:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:33 compute-2 sudo[234325]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:33 compute-2 ceph-mon[74913]: pgmap v580: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:05:33 compute-2 sudo[234401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slgqjkxlxaawfguexlypuknidapafamo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090733.0295084-5031-230863453702335/AnsiballZ_systemd.py'
Oct 10 10:05:33 compute-2 sudo[234401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:34.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:34 compute-2 python3.9[234403]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 10:05:34 compute-2 systemd[1]: Reloading.
Oct 10 10:05:34 compute-2 systemd-rc-local-generator[234430]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:05:34 compute-2 systemd-sysv-generator[234433]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:05:34 compute-2 sudo[234401]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:34 compute-2 sudo[234513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsohbycilvdnyygijhkymubtfdcosrbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090733.0295084-5031-230863453702335/AnsiballZ_systemd.py'
Oct 10 10:05:34 compute-2 sudo[234513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003d60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:35 compute-2 python3.9[234515]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 10:05:35 compute-2 systemd[1]: Reloading.
Oct 10 10:05:35 compute-2 systemd-rc-local-generator[234546]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 10:05:35 compute-2 systemd-sysv-generator[234549]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 10:05:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Oct 10 10:05:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:35.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:35 compute-2 systemd[1]: Starting nova_compute container...
Oct 10 10:05:35 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:05:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:35 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:35 compute-2 podman[234556]: 2025-10-10 10:05:35.59512398 +0000 UTC m=+0.107480509 container init a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:05:35 compute-2 podman[234556]: 2025-10-10 10:05:35.603013551 +0000 UTC m=+0.115370050 container start a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:05:35 compute-2 podman[234556]: nova_compute
Oct 10 10:05:35 compute-2 nova_compute[234571]: + sudo -E kolla_set_configs
Oct 10 10:05:35 compute-2 systemd[1]: Started nova_compute container.
Oct 10 10:05:35 compute-2 sudo[234513]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:35 compute-2 sudo[234575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:05:35 compute-2 sudo[234575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:05:35 compute-2 sudo[234575]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Validating config file
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Copying service configuration files
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Deleting /etc/ceph
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Creating directory /etc/ceph
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /etc/ceph
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Writing out command to execute
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 10 10:05:35 compute-2 nova_compute[234571]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 10 10:05:35 compute-2 nova_compute[234571]: ++ cat /run_command
Oct 10 10:05:35 compute-2 nova_compute[234571]: + CMD=nova-compute
Oct 10 10:05:35 compute-2 nova_compute[234571]: + ARGS=
Oct 10 10:05:35 compute-2 nova_compute[234571]: + sudo kolla_copy_cacerts
Oct 10 10:05:35 compute-2 nova_compute[234571]: + [[ ! -n '' ]]
Oct 10 10:05:35 compute-2 nova_compute[234571]: + . kolla_extend_start
Oct 10 10:05:35 compute-2 nova_compute[234571]: + echo 'Running command: '\''nova-compute'\'''
Oct 10 10:05:35 compute-2 nova_compute[234571]: Running command: 'nova-compute'
Oct 10 10:05:35 compute-2 nova_compute[234571]: + umask 0022
Oct 10 10:05:35 compute-2 nova_compute[234571]: + exec nova-compute
Oct 10 10:05:35 compute-2 ceph-mon[74913]: pgmap v581: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:05:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:36.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:05:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003d80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:37.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:37 compute-2 python3.9[234760]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:05:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:37 compute-2 ceph-mon[74913]: pgmap v582: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:37 compute-2 nova_compute[234571]: 2025-10-10 10:05:37.907 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 10 10:05:37 compute-2 nova_compute[234571]: 2025-10-10 10:05:37.908 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 10 10:05:37 compute-2 nova_compute[234571]: 2025-10-10 10:05:37.908 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 10 10:05:37 compute-2 nova_compute[234571]: 2025-10-10 10:05:37.908 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 10 10:05:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:38.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.061 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.077 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:05:38 compute-2 python3.9[234913]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:05:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.790 2 INFO nova.virt.driver [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 10 10:05:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.932 2 INFO nova.compute.provider_config [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.982 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.982 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.983 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.983 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.983 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.983 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.986 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.986 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.986 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.986 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.986 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.987 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.987 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.987 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.987 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.987 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.988 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.988 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.988 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.988 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.988 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.989 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.989 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.989 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.989 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.989 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.991 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.991 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.991 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.991 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.993 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.993 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.993 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.993 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.993 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.995 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.995 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.995 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.995 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.995 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.996 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.996 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.996 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.996 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.996 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.997 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.997 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.997 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.997 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.997 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.998 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.998 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.998 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.998 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.999 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.999 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.999 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:38 compute-2 nova_compute[234571]: 2025-10-10 10:05:38.999 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.000 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.000 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.000 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.000 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.001 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.001 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.001 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.001 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.002 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.002 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.002 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.002 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.003 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.003 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.003 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.003 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.004 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.004 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.004 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.004 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.004 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.005 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.005 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.005 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.005 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.005 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.006 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.006 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.006 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.006 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.006 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.007 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.007 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.007 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.007 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.007 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.009 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.009 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.009 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.009 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.010 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.010 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.010 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.010 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.010 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.011 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.011 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.011 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.011 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.011 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.012 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.012 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.012 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.012 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.012 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.013 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.013 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.013 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.013 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.013 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.014 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.014 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.014 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.014 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.014 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.015 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.015 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.015 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.015 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.016 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.016 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.016 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.016 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.016 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.017 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.017 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.017 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.017 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.017 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.018 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.018 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.018 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.018 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.019 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.019 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.019 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.019 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.020 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.020 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.020 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.020 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.022 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.022 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.022 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.022 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.022 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.023 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.023 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.023 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.023 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.023 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.024 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.024 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.024 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.024 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.024 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.025 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.025 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.025 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.025 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.026 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.026 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.026 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.026 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.026 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.027 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.027 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.027 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.027 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.027 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.029 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.029 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.029 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.029 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.029 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.031 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.031 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.031 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.031 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.031 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.032 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.032 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.032 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.032 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.034 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.034 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.034 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.034 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.034 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.036 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.036 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.036 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.036 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.036 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.037 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.037 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.037 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.037 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.037 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.039 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.039 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.039 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.039 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.039 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.040 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.040 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.040 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.040 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.041 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.041 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.041 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.041 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.042 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.042 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.042 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.042 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.042 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.043 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.043 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.043 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.043 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.044 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.044 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.044 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.044 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.044 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.046 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.046 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.046 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.046 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.046 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.047 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.047 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.047 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.047 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.047 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.049 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.049 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.049 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.049 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.049 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.051 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.051 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.051 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.051 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.052 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.052 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.052 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.052 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.052 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.053 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.053 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.053 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.053 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.054 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.054 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.054 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.054 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.054 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.055 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.055 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.055 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.055 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.055 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.056 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.056 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.056 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.056 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.056 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.057 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.057 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.057 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.057 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.057 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.058 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.058 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.058 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.058 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.058 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.059 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.059 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.059 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.059 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.059 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.060 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.060 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.060 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.060 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.061 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.061 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.061 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.061 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.061 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.062 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.062 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.062 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.062 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.062 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.063 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.063 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.063 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.063 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.063 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.064 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.064 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.064 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.064 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.064 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.065 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.065 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.065 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.065 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.065 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.066 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.066 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.066 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.066 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.067 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.067 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.067 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.067 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.069 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.069 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.069 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.069 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.070 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.070 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.070 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.070 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.071 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.071 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.071 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.071 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.071 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.072 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.072 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.072 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.072 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.074 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.074 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.074 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.074 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.074 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.075 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.075 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.075 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.075 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.076 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.076 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.076 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.076 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.077 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.077 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.077 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.077 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.077 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.078 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.078 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.078 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.078 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.078 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.079 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.079 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.079 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.079 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.079 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.080 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.080 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.080 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.080 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.080 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.081 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.081 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.081 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.081 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.081 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.082 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.082 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.082 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.082 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.082 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.083 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.083 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.083 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.083 2 WARNING oslo_config.cfg [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 10 10:05:39 compute-2 nova_compute[234571]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 10 10:05:39 compute-2 nova_compute[234571]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 10 10:05:39 compute-2 nova_compute[234571]: and ``live_migration_inbound_addr`` respectively.
Oct 10 10:05:39 compute-2 nova_compute[234571]: ).  Its value may be silently ignored in the future.
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.084 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.084 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.084 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.084 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.085 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.085 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.085 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.085 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.086 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.086 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.086 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.086 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.087 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.087 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.087 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.087 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.088 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.088 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.088 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rbd_secret_uuid        = 21f084a3-af34-5230-afe4-ea5cd24a55f4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.088 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.088 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.089 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.089 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.089 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.089 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.089 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.090 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.090 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.090 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.090 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.090 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.091 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.091 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.091 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.091 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.091 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.092 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.092 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.092 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.092 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.092 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.093 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.093 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.093 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.093 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.095 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.095 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.095 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.095 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.095 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.097 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.097 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.097 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.097 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.097 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.098 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.098 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.098 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.098 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.099 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.099 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.099 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.099 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.099 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.100 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.100 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.100 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.100 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.100 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.102 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.102 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.102 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.102 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.102 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.103 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.103 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.103 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.103 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.104 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.104 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.104 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.104 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.104 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.105 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.105 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.105 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.105 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.106 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.106 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.106 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.106 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.106 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.107 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.107 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.107 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.107 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.109 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.109 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.109 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.109 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.109 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.110 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.110 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.110 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.110 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.110 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.111 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.111 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.111 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.111 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.111 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.112 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.112 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.112 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.112 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.113 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.113 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.113 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.113 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.113 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.114 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.114 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.114 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.114 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.114 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.116 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.116 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.116 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.116 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.118 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.118 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.118 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.118 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.119 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.119 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.119 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.119 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.120 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.120 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.120 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.120 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.120 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.121 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.121 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.121 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.121 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.122 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.122 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.122 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.122 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.124 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.124 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.124 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.124 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.124 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.125 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.125 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.125 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.125 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.126 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.126 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.126 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.126 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.126 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.127 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.127 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.127 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.127 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.127 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.128 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.128 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.128 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.128 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.128 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.130 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.130 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.130 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.130 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.130 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.132 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.132 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.132 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.132 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.132 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.133 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.133 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.133 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.133 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.133 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.134 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.134 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.134 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.134 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.134 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.135 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.135 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.135 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.136 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.136 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.136 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.136 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.136 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.138 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.138 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.138 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.138 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.138 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.139 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.139 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.139 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.139 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.139 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.141 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.141 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.141 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.141 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.141 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.143 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.143 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.143 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.143 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.144 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.144 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.144 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.145 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.145 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.145 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.145 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.145 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.147 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.147 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.147 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.147 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.147 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.148 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.148 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.148 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.148 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.148 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.149 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.149 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.149 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.149 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.150 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.150 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.150 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.150 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.150 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.151 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.151 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.151 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.151 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.151 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.152 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.152 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.152 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.152 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.152 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.153 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.153 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.153 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.153 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.153 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.155 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.155 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.155 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.155 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.156 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.156 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.156 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.156 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.156 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.158 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.158 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.158 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.158 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.158 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.159 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.159 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.159 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.159 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.159 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.160 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.160 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.160 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.160 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.161 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.161 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.161 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.161 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.161 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.163 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.163 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.163 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.163 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.163 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.164 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.164 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.164 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.164 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.166 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.166 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.166 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.166 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.166 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.167 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.167 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.167 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.167 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.167 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.168 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.168 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.168 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.168 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.170 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.170 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.171 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.196 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.197 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.197 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.197 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Oct 10 10:05:39 compute-2 systemd[1]: Starting libvirt QEMU daemon...
Oct 10 10:05:39 compute-2 systemd[1]: Started libvirt QEMU daemon.
Oct 10 10:05:39 compute-2 sudo[235089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:05:39 compute-2 sudo[235089]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:05:39 compute-2 sudo[235089]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.287 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f46c545a5e0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.290 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f46c545a5e0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.291 2 INFO nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Connection event '1' reason 'None'
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.315 2 WARNING nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Oct 10 10:05:39 compute-2 nova_compute[234571]: 2025-10-10 10:05:39.316 2 DEBUG nova.virt.libvirt.volume.mount [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 10 10:05:39 compute-2 sudo[235132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Oct 10 10:05:39 compute-2 sudo[235132]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:05:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:39.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:39 compute-2 python3.9[235066]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 10:05:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:39 compute-2 sudo[235132]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:39 compute-2 sudo[235213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:05:39 compute-2 sudo[235213]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:05:39 compute-2 sudo[235213]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:39 compute-2 sudo[235238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:05:39 compute-2 sudo[235238]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:05:39 compute-2 ceph-mon[74913]: pgmap v583: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:05:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:05:39 compute-2 podman[235266]: 2025-10-10 10:05:39.894072511 +0000 UTC m=+0.055685907 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible)
Oct 10 10:05:39 compute-2 podman[235267]: 2025-10-10 10:05:39.922748765 +0000 UTC m=+0.082105759 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 10 10:05:39 compute-2 podman[235265]: 2025-10-10 10:05:39.935678678 +0000 UTC m=+0.095225698 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct 10 10:05:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:40.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:40 compute-2 sudo[235474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvsnkxbyngrmsfozucacbnlronqigxwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090739.89119-5212-278790240626336/AnsiballZ_podman_container.py'
Oct 10 10:05:40 compute-2 sudo[235474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.215 2 INFO nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Libvirt host capabilities <capabilities>
Oct 10 10:05:40 compute-2 nova_compute[234571]: 
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <host>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <uuid>55d065af-0252-4401-ad6e-822a36bead06</uuid>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <cpu>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <arch>x86_64</arch>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model>EPYC-Rome-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <vendor>AMD</vendor>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <microcode version='16777317'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <signature family='23' model='49' stepping='0'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <maxphysaddr mode='emulate' bits='40'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='x2apic'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='tsc-deadline'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='osxsave'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='hypervisor'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='tsc_adjust'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='spec-ctrl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='stibp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='arch-capabilities'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='cmp_legacy'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='topoext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='virt-ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='lbrv'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='tsc-scale'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='vmcb-clean'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='pause-filter'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='pfthreshold'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='svme-addr-chk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='rdctl-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='skip-l1dfl-vmentry'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='mds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature name='pschange-mc-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <pages unit='KiB' size='4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <pages unit='KiB' size='2048'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <pages unit='KiB' size='1048576'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </cpu>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <power_management>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <suspend_mem/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </power_management>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <iommu support='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <migration_features>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <live/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <uri_transports>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <uri_transport>tcp</uri_transport>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <uri_transport>rdma</uri_transport>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </uri_transports>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </migration_features>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <topology>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <cells num='1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <cell id='0'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:           <memory unit='KiB'>7864356</memory>
Oct 10 10:05:40 compute-2 nova_compute[234571]:           <pages unit='KiB' size='4'>1966089</pages>
Oct 10 10:05:40 compute-2 nova_compute[234571]:           <pages unit='KiB' size='2048'>0</pages>
Oct 10 10:05:40 compute-2 nova_compute[234571]:           <pages unit='KiB' size='1048576'>0</pages>
Oct 10 10:05:40 compute-2 nova_compute[234571]:           <distances>
Oct 10 10:05:40 compute-2 nova_compute[234571]:             <sibling id='0' value='10'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:           </distances>
Oct 10 10:05:40 compute-2 nova_compute[234571]:           <cpus num='8'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:           </cpus>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         </cell>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </cells>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </topology>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <cache>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </cache>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <secmodel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model>selinux</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <doi>0</doi>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </secmodel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <secmodel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model>dac</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <doi>0</doi>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <baselabel type='kvm'>+107:+107</baselabel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <baselabel type='qemu'>+107:+107</baselabel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </secmodel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </host>
Oct 10 10:05:40 compute-2 nova_compute[234571]: 
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <guest>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <os_type>hvm</os_type>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <arch name='i686'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <wordsize>32</wordsize>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <domain type='qemu'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <domain type='kvm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </arch>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <features>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <pae/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <nonpae/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <acpi default='on' toggle='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <apic default='on' toggle='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <cpuselection/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <deviceboot/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <disksnapshot default='on' toggle='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <externalSnapshot/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </features>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </guest>
Oct 10 10:05:40 compute-2 nova_compute[234571]: 
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <guest>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <os_type>hvm</os_type>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <arch name='x86_64'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <wordsize>64</wordsize>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <domain type='qemu'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <domain type='kvm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </arch>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <features>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <acpi default='on' toggle='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <apic default='on' toggle='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <cpuselection/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <deviceboot/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <disksnapshot default='on' toggle='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <externalSnapshot/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </features>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </guest>
Oct 10 10:05:40 compute-2 nova_compute[234571]: 
Oct 10 10:05:40 compute-2 nova_compute[234571]: </capabilities>
Oct 10 10:05:40 compute-2 nova_compute[234571]: 
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.229 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 10 10:05:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:40 compute-2 sudo[235238]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.259 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 10 10:05:40 compute-2 nova_compute[234571]: <domainCapabilities>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <path>/usr/libexec/qemu-kvm</path>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <domain>kvm</domain>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <arch>i686</arch>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <vcpu max='4096'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <iothreads supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <os supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <enum name='firmware'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <loader supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>rom</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pflash</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='readonly'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>yes</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>no</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='secure'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>no</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </loader>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </os>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <cpu>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='host-passthrough' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='hostPassthroughMigratable'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>on</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>off</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='maximum' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='maximumMigratable'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>on</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>off</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='host-model' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <vendor>AMD</vendor>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='x2apic'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc-deadline'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='hypervisor'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc_adjust'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='spec-ctrl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='stibp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='arch-capabilities'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='cmp_legacy'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='overflow-recov'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='succor'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='amd-ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='virt-ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='lbrv'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc-scale'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='vmcb-clean'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='flushbyasid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pause-filter'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pfthreshold'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='svme-addr-chk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='rdctl-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='mds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pschange-mc-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='gds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='rfds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='disable' name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='custom' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Dhyana-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Genoa'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='auto-ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Genoa-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='auto-ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-128'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-256'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-512'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v6'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v7'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='KnightsMill'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512er'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512pf'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='KnightsMill-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512er'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512pf'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G4-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tbm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G5-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tbm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SierraForest'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cmpccxadd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SierraForest-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cmpccxadd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='athlon'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='athlon-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='core2duo'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='core2duo-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='coreduo'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='coreduo-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='n270'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='n270-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='phenom'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='phenom-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </cpu>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <memoryBacking supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <enum name='sourceType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>file</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>anonymous</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>memfd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </memoryBacking>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <devices>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <disk supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='diskDevice'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>disk</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>cdrom</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>floppy</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>lun</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='bus'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>fdc</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>scsi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>sata</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-non-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </disk>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <graphics supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vnc</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>egl-headless</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>dbus</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </graphics>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <video supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='modelType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vga</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>cirrus</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>none</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>bochs</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>ramfb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </video>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <hostdev supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='mode'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>subsystem</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='startupPolicy'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>default</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>mandatory</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>requisite</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>optional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='subsysType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pci</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>scsi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='capsType'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='pciBackend'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </hostdev>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <rng supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-non-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>random</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>egd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>builtin</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </rng>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <filesystem supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='driverType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>path</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>handle</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtiofs</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </filesystem>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <tpm supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tpm-tis</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tpm-crb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>emulator</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>external</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendVersion'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>2.0</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </tpm>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <redirdev supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='bus'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </redirdev>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <channel supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pty</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>unix</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </channel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <crypto supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>qemu</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>builtin</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </crypto>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <interface supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>default</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>passt</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </interface>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <panic supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>isa</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>hyperv</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </panic>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </devices>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <features>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <gic supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <vmcoreinfo supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <genid supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <backingStoreInput supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <backup supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <async-teardown supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <ps2 supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <sev supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <sgx supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <hyperv supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='features'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>relaxed</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vapic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>spinlocks</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vpindex</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>runtime</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>synic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>stimer</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>reset</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vendor_id</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>frequencies</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>reenlightenment</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tlbflush</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>ipi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>avic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>emsr_bitmap</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>xmm_input</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </hyperv>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <launchSecurity supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </features>
Oct 10 10:05:40 compute-2 nova_compute[234571]: </domainCapabilities>
Oct 10 10:05:40 compute-2 nova_compute[234571]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.268 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 10 10:05:40 compute-2 nova_compute[234571]: <domainCapabilities>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <path>/usr/libexec/qemu-kvm</path>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <domain>kvm</domain>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <arch>i686</arch>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <vcpu max='240'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <iothreads supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <os supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <enum name='firmware'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <loader supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>rom</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pflash</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='readonly'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>yes</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>no</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='secure'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>no</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </loader>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </os>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <cpu>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='host-passthrough' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='hostPassthroughMigratable'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>on</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>off</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='maximum' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='maximumMigratable'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>on</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>off</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='host-model' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <vendor>AMD</vendor>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='x2apic'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc-deadline'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='hypervisor'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc_adjust'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='spec-ctrl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='stibp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='arch-capabilities'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='cmp_legacy'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='overflow-recov'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='succor'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='amd-ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='virt-ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='lbrv'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc-scale'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='vmcb-clean'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='flushbyasid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pause-filter'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pfthreshold'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='svme-addr-chk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='rdctl-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='mds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pschange-mc-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='gds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='rfds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='disable' name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='custom' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Dhyana-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Genoa'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='auto-ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Genoa-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='auto-ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-128'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-256'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-512'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v6'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v7'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='KnightsMill'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512er'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512pf'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='KnightsMill-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512er'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512pf'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G4-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tbm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G5-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tbm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SierraForest'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cmpccxadd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SierraForest-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cmpccxadd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='athlon'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='athlon-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='core2duo'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='core2duo-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='coreduo'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='coreduo-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='n270'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='n270-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='phenom'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='phenom-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </cpu>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <memoryBacking supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <enum name='sourceType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>file</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>anonymous</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>memfd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </memoryBacking>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <devices>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <disk supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='diskDevice'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>disk</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>cdrom</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>floppy</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>lun</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='bus'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>ide</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>fdc</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>scsi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>sata</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-non-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </disk>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <graphics supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vnc</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>egl-headless</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>dbus</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </graphics>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <video supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='modelType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vga</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>cirrus</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>none</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>bochs</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>ramfb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </video>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <hostdev supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='mode'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>subsystem</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='startupPolicy'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>default</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>mandatory</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>requisite</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>optional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='subsysType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pci</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>scsi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='capsType'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='pciBackend'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </hostdev>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <rng supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-non-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>random</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>egd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>builtin</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </rng>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <filesystem supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='driverType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>path</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>handle</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtiofs</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </filesystem>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <tpm supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tpm-tis</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tpm-crb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>emulator</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>external</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendVersion'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>2.0</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </tpm>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <redirdev supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='bus'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </redirdev>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <channel supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pty</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>unix</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </channel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <crypto supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>qemu</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>builtin</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </crypto>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <interface supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>default</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>passt</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </interface>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <panic supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>isa</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>hyperv</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </panic>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </devices>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <features>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <gic supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <vmcoreinfo supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <genid supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <backingStoreInput supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <backup supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <async-teardown supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <ps2 supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <sev supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <sgx supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <hyperv supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='features'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>relaxed</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vapic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>spinlocks</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vpindex</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>runtime</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>synic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>stimer</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>reset</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vendor_id</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>frequencies</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>reenlightenment</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tlbflush</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>ipi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>avic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>emsr_bitmap</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>xmm_input</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </hyperv>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <launchSecurity supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </features>
Oct 10 10:05:40 compute-2 nova_compute[234571]: </domainCapabilities>
Oct 10 10:05:40 compute-2 nova_compute[234571]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.288 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.292 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 10 10:05:40 compute-2 nova_compute[234571]: <domainCapabilities>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <path>/usr/libexec/qemu-kvm</path>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <domain>kvm</domain>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <arch>x86_64</arch>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <vcpu max='4096'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <iothreads supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <os supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <enum name='firmware'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>efi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <loader supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>rom</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pflash</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='readonly'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>yes</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>no</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='secure'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>yes</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>no</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </loader>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </os>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <cpu>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='host-passthrough' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='hostPassthroughMigratable'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>on</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>off</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='maximum' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='maximumMigratable'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>on</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>off</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='host-model' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <vendor>AMD</vendor>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='x2apic'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc-deadline'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='hypervisor'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc_adjust'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='spec-ctrl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='stibp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='arch-capabilities'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='cmp_legacy'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='overflow-recov'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='succor'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='amd-ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='virt-ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='lbrv'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc-scale'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='vmcb-clean'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='flushbyasid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pause-filter'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pfthreshold'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='svme-addr-chk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='rdctl-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='mds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pschange-mc-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='gds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='rfds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='disable' name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='custom' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Dhyana-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Genoa'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='auto-ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Genoa-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='auto-ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-128'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-256'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-512'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v6'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v7'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='KnightsMill'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512er'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512pf'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='KnightsMill-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512er'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512pf'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G4-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tbm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G5-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tbm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SierraForest'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cmpccxadd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SierraForest-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cmpccxadd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='athlon'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='athlon-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='core2duo'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='core2duo-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='coreduo'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='coreduo-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='n270'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='n270-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='phenom'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='phenom-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </cpu>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <memoryBacking supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <enum name='sourceType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>file</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>anonymous</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>memfd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </memoryBacking>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <devices>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <disk supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='diskDevice'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>disk</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>cdrom</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>floppy</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>lun</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='bus'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>fdc</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>scsi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>sata</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-non-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </disk>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <graphics supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vnc</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>egl-headless</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>dbus</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </graphics>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <video supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='modelType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vga</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>cirrus</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>none</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>bochs</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>ramfb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </video>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <hostdev supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='mode'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>subsystem</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='startupPolicy'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>default</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>mandatory</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>requisite</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>optional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='subsysType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pci</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>scsi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='capsType'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='pciBackend'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </hostdev>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <rng supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-non-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>random</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>egd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>builtin</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </rng>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <filesystem supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='driverType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>path</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>handle</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtiofs</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </filesystem>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <tpm supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tpm-tis</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tpm-crb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>emulator</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>external</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendVersion'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>2.0</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </tpm>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <redirdev supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='bus'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </redirdev>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <channel supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pty</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>unix</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </channel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <crypto supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>qemu</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>builtin</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </crypto>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <interface supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>default</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>passt</value>
Oct 10 10:05:40 compute-2 python3.9[235476]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </interface>
Oct 10 10:05:40 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <panic supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>isa</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>hyperv</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </panic>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </devices>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <features>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <gic supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <vmcoreinfo supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <genid supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <backingStoreInput supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <backup supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <async-teardown supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <ps2 supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <sev supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <sgx supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <hyperv supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='features'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>relaxed</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vapic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>spinlocks</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vpindex</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>runtime</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>synic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>stimer</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>reset</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vendor_id</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>frequencies</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>reenlightenment</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tlbflush</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>ipi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>avic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>emsr_bitmap</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>xmm_input</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </hyperv>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <launchSecurity supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </features>
Oct 10 10:05:40 compute-2 nova_compute[234571]: </domainCapabilities>
Oct 10 10:05:40 compute-2 nova_compute[234571]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.342 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 10 10:05:40 compute-2 nova_compute[234571]: <domainCapabilities>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <path>/usr/libexec/qemu-kvm</path>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <domain>kvm</domain>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <arch>x86_64</arch>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <vcpu max='240'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <iothreads supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <os supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <enum name='firmware'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <loader supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>rom</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pflash</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='readonly'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>yes</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>no</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='secure'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>no</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </loader>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </os>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <cpu>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='host-passthrough' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='hostPassthroughMigratable'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>on</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>off</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='maximum' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='maximumMigratable'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>on</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>off</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='host-model' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <vendor>AMD</vendor>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='x2apic'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc-deadline'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='hypervisor'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc_adjust'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='spec-ctrl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='stibp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='arch-capabilities'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='cmp_legacy'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='overflow-recov'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='succor'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='amd-ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='virt-ssbd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='lbrv'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='tsc-scale'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='vmcb-clean'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='flushbyasid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pause-filter'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pfthreshold'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='svme-addr-chk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='rdctl-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='mds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='pschange-mc-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='gds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='require' name='rfds-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <feature policy='disable' name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <mode name='custom' supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Broadwell-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cascadelake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Cooperlake-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Denverton-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Dhyana-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Genoa'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='auto-ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Genoa-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='auto-ibrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Milan-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amd-psfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='stibp-always-on'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-Rome-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='EPYC-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='GraniteRapids-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-128'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-256'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx10-512'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='prefetchiti'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Haswell-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-noTSX'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v6'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Icelake-Server-v7'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='IvyBridge-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='KnightsMill'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512er'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512pf'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='KnightsMill-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512er'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512pf'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G4-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tbm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Opteron_G5-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fma4'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tbm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xop'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SapphireRapids-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='amx-tile'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-bf16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-fp16'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bitalg'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrc'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fzrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='la57'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='taa-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xfd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SierraForest'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cmpccxadd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='SierraForest-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ifma'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cmpccxadd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fbsdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='fsrs'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ibrs-all'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mcdt-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pbrsb-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='psdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='serialize'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vaes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Client-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='hle'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='rtm'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Skylake-Server-v5'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512bw'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512cd'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512dq'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512f'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='avx512vl'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='invpcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pcid'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='pku'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='mpx'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v2'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v3'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='core-capability'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='split-lock-detect'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='Snowridge-v4'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='cldemote'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='erms'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='gfni'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdir64b'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='movdiri'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='xsaves'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='athlon'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='athlon-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='core2duo'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='core2duo-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='coreduo'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='coreduo-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='n270'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='n270-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='ss'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='phenom'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <blockers model='phenom-v1'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnow'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <feature name='3dnowext'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </blockers>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </mode>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </cpu>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <memoryBacking supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <enum name='sourceType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>file</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>anonymous</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <value>memfd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </memoryBacking>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <devices>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <disk supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='diskDevice'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>disk</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>cdrom</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>floppy</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>lun</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='bus'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>ide</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>fdc</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>scsi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>sata</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-non-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </disk>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <graphics supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vnc</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>egl-headless</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>dbus</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </graphics>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <video supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='modelType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vga</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>cirrus</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>none</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>bochs</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>ramfb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </video>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <hostdev supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='mode'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>subsystem</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='startupPolicy'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>default</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>mandatory</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>requisite</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>optional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='subsysType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pci</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>scsi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='capsType'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='pciBackend'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </hostdev>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <rng supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtio-non-transitional</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>random</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>egd</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>builtin</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </rng>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <filesystem supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='driverType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>path</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>handle</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>virtiofs</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </filesystem>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <tpm supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tpm-tis</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tpm-crb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>emulator</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>external</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendVersion'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>2.0</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </tpm>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <redirdev supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='bus'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>usb</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </redirdev>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <channel supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>pty</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>unix</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </channel>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <crypto supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='type'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>qemu</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendModel'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>builtin</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </crypto>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <interface supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='backendType'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>default</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>passt</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </interface>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <panic supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='model'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>isa</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>hyperv</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </panic>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </devices>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   <features>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <gic supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <vmcoreinfo supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <genid supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <backingStoreInput supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <backup supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <async-teardown supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <ps2 supported='yes'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <sev supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <sgx supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <hyperv supported='yes'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       <enum name='features'>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>relaxed</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vapic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>spinlocks</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vpindex</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>runtime</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>synic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>stimer</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>reset</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>vendor_id</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>frequencies</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>reenlightenment</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>tlbflush</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>ipi</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>avic</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>emsr_bitmap</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:         <value>xmm_input</value>
Oct 10 10:05:40 compute-2 nova_compute[234571]:       </enum>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     </hyperv>
Oct 10 10:05:40 compute-2 nova_compute[234571]:     <launchSecurity supported='no'/>
Oct 10 10:05:40 compute-2 nova_compute[234571]:   </features>
Oct 10 10:05:40 compute-2 nova_compute[234571]: </domainCapabilities>
Oct 10 10:05:40 compute-2 nova_compute[234571]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.396 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.397 2 INFO nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Secure Boot support detected
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.398 2 INFO nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.398 2 INFO nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.408 2 DEBUG nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.448 2 INFO nova.virt.node [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Determined node identity dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from /var/lib/nova/compute_id
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.468 2 WARNING nova.compute.manager [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Compute nodes ['dcdfa54c-9f95-46da-9af1-da3e28d81cf0'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Oct 10 10:05:40 compute-2 sudo[235474]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.518 2 INFO nova.compute.manager [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.588 2 WARNING nova.compute.manager [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.589 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.589 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.589 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.589 2 DEBUG nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:05:40 compute-2 nova_compute[234571]: 2025-10-10 10:05:40.589 2 DEBUG oslo_concurrency.processutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:05:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003da0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:40 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 10:05:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:05:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:05:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:05:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:05:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:05:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:05:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:05:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:05:40 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3983368344' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.013 2 DEBUG oslo_concurrency.processutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:05:41 compute-2 systemd[1]: Starting libvirt nodedev daemon...
Oct 10 10:05:41 compute-2 systemd[1]: Started libvirt nodedev daemon.
Oct 10 10:05:41 compute-2 sudo[235709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhxczeoipqvazjsrbktbjctjefeihkeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090740.8608813-5236-144429006296526/AnsiballZ_systemd.py'
Oct 10 10:05:41 compute-2 sudo[235709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.323 2 WARNING nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.325 2 DEBUG nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5218MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.325 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.325 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.338 2 WARNING nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] No compute node record for compute-2.ctlplane.example.com:dcdfa54c-9f95-46da-9af1-da3e28d81cf0: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host dcdfa54c-9f95-46da-9af1-da3e28d81cf0 could not be found.
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.358 2 INFO nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: dcdfa54c-9f95-46da-9af1-da3e28d81cf0
Oct 10 10:05:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:41.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.439 2 DEBUG nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.440 2 DEBUG nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:05:41 compute-2 python3.9[235711]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 10:05:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:05:41.456 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:05:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:05:41.457 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:05:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:05:41.457 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:05:41 compute-2 systemd[1]: Stopping nova_compute container...
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.566 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.567 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.567 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:05:41 compute-2 nova_compute[234571]: 2025-10-10 10:05:41.567 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:05:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:41 compute-2 unix_chkpwd[235732]: password check failed for user (root)
Oct 10 10:05:41 compute-2 sshd-session[235686]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 10 10:05:41 compute-2 ceph-mon[74913]: pgmap v584: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:41 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3983368344' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:05:41 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1003042065' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:05:41 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2390731396' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:05:41 compute-2 virtqemud[235088]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct 10 10:05:41 compute-2 virtqemud[235088]: hostname: compute-2
Oct 10 10:05:41 compute-2 virtqemud[235088]: End of file while reading data: Input/output error
Oct 10 10:05:41 compute-2 systemd[1]: libpod-a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f.scope: Deactivated successfully.
Oct 10 10:05:41 compute-2 systemd[1]: libpod-a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f.scope: Consumed 3.494s CPU time.
Oct 10 10:05:41 compute-2 conmon[234571]: conmon a677c95a28d87d0fa998 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f.scope/container/memory.events
Oct 10 10:05:41 compute-2 podman[235717]: 2025-10-10 10:05:41.974520705 +0000 UTC m=+0.450426097 container died a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, tcib_managed=true, container_name=nova_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.vendor=CentOS)
Oct 10 10:05:41 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f-userdata-shm.mount: Deactivated successfully.
Oct 10 10:05:42 compute-2 systemd[1]: var-lib-containers-storage-overlay-c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689-merged.mount: Deactivated successfully.
Oct 10 10:05:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:42.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:43.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:43 compute-2 podman[235717]: 2025-10-10 10:05:43.395495796 +0000 UTC m=+1.871401108 container cleanup a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:05:43 compute-2 podman[235717]: nova_compute
Oct 10 10:05:43 compute-2 podman[235749]: nova_compute
Oct 10 10:05:43 compute-2 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct 10 10:05:43 compute-2 systemd[1]: Stopped nova_compute container.
Oct 10 10:05:43 compute-2 systemd[1]: Starting nova_compute container...
Oct 10 10:05:43 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:05:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:43 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:43 compute-2 podman[235759]: 2025-10-10 10:05:43.571801649 +0000 UTC m=+0.085114836 container init a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:05:43 compute-2 podman[235759]: 2025-10-10 10:05:43.579021679 +0000 UTC m=+0.092334836 container start a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible)
Oct 10 10:05:43 compute-2 podman[235759]: nova_compute
Oct 10 10:05:43 compute-2 nova_compute[235775]: + sudo -E kolla_set_configs
Oct 10 10:05:43 compute-2 systemd[1]: Started nova_compute container.
Oct 10 10:05:43 compute-2 sudo[235709]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Validating config file
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Copying service configuration files
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Deleting /etc/ceph
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Creating directory /etc/ceph
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /etc/ceph
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Writing out command to execute
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 10 10:05:43 compute-2 nova_compute[235775]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 10 10:05:43 compute-2 nova_compute[235775]: ++ cat /run_command
Oct 10 10:05:43 compute-2 nova_compute[235775]: + CMD=nova-compute
Oct 10 10:05:43 compute-2 nova_compute[235775]: + ARGS=
Oct 10 10:05:43 compute-2 nova_compute[235775]: + sudo kolla_copy_cacerts
Oct 10 10:05:43 compute-2 nova_compute[235775]: + [[ ! -n '' ]]
Oct 10 10:05:43 compute-2 nova_compute[235775]: + . kolla_extend_start
Oct 10 10:05:43 compute-2 nova_compute[235775]: Running command: 'nova-compute'
Oct 10 10:05:43 compute-2 nova_compute[235775]: + echo 'Running command: '\''nova-compute'\'''
Oct 10 10:05:43 compute-2 nova_compute[235775]: + umask 0022
Oct 10 10:05:43 compute-2 nova_compute[235775]: + exec nova-compute
Oct 10 10:05:44 compute-2 ceph-mon[74913]: pgmap v585: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:05:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:44.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:44 compute-2 sudo[235936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnsnikrdfjzmabvablahssyyjxhejakj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1760090743.868629-5263-230315442498072/AnsiballZ_podman_container.py'
Oct 10 10:05:44 compute-2 sudo[235936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:05:44 compute-2 sshd-session[235686]: Failed password for root from 80.94.93.176 port 46532 ssh2
Oct 10 10:05:44 compute-2 python3.9[235938]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 10 10:05:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:44 compute-2 systemd[1]: Started libpod-conmon-95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e.scope.
Oct 10 10:05:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:44 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:05:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6010f60c692d8f88c6982f744376325ee0757b82dcf004acde8ac551cd6e318/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6010f60c692d8f88c6982f744376325ee0757b82dcf004acde8ac551cd6e318/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6010f60c692d8f88c6982f744376325ee0757b82dcf004acde8ac551cd6e318/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 10 10:05:44 compute-2 podman[235965]: 2025-10-10 10:05:44.709731682 +0000 UTC m=+0.158546487 container init 95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:05:44 compute-2 podman[235965]: 2025-10-10 10:05:44.723479331 +0000 UTC m=+0.172294106 container start 95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 10 10:05:44 compute-2 python3.9[235938]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Applying nova statedir ownership
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct 10 10:05:44 compute-2 nova_compute_init[235986]: INFO:nova_statedir:Nova statedir ownership complete
Oct 10 10:05:44 compute-2 systemd[1]: libpod-95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e.scope: Deactivated successfully.
Oct 10 10:05:44 compute-2 podman[235987]: 2025-10-10 10:05:44.783979511 +0000 UTC m=+0.030250256 container died 95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, container_name=nova_compute_init, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 10:05:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003de0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:44 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e-userdata-shm.mount: Deactivated successfully.
Oct 10 10:05:44 compute-2 systemd[1]: var-lib-containers-storage-overlay-e6010f60c692d8f88c6982f744376325ee0757b82dcf004acde8ac551cd6e318-merged.mount: Deactivated successfully.
Oct 10 10:05:44 compute-2 podman[235996]: 2025-10-10 10:05:44.856992699 +0000 UTC m=+0.066698088 container cleanup 95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 10:05:44 compute-2 systemd[1]: libpod-conmon-95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e.scope: Deactivated successfully.
Oct 10 10:05:44 compute-2 sudo[235936]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:45.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:45 compute-2 nova_compute[235775]: 2025-10-10 10:05:45.640 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 10 10:05:45 compute-2 nova_compute[235775]: 2025-10-10 10:05:45.640 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 10 10:05:45 compute-2 nova_compute[235775]: 2025-10-10 10:05:45.640 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Oct 10 10:05:45 compute-2 nova_compute[235775]: 2025-10-10 10:05:45.640 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 10 10:05:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:45 compute-2 sshd-session[199939]: Connection closed by 192.168.122.30 port 57240
Oct 10 10:05:45 compute-2 sshd-session[199936]: pam_unix(sshd:session): session closed for user zuul
Oct 10 10:05:45 compute-2 systemd[1]: session-54.scope: Deactivated successfully.
Oct 10 10:05:45 compute-2 systemd[1]: session-54.scope: Consumed 2min 39.116s CPU time.
Oct 10 10:05:45 compute-2 systemd-logind[796]: Session 54 logged out. Waiting for processes to exit.
Oct 10 10:05:45 compute-2 systemd-logind[796]: Removed session 54.
Oct 10 10:05:45 compute-2 nova_compute[235775]: 2025-10-10 10:05:45.774 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:05:45 compute-2 nova_compute[235775]: 2025-10-10 10:05:45.804 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:05:45 compute-2 unix_chkpwd[236071]: password check failed for user (root)
Oct 10 10:05:45 compute-2 sudo[236054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:05:45 compute-2 sudo[236054]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:05:45 compute-2 sudo[236054]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:46 compute-2 ceph-mon[74913]: pgmap v586: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:05:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:46.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:05:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:05:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.244 2 INFO nova.virt.driver [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.350 2 INFO nova.compute.provider_config [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.358 2 DEBUG oslo_concurrency.lockutils [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.359 2 DEBUG oslo_concurrency.lockutils [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.359 2 DEBUG oslo_concurrency.lockutils [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.360 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.360 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.360 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.360 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.360 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.362 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.362 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.362 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.362 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.362 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.367 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.367 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.367 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.367 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.368 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.368 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.368 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.368 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.368 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.371 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.371 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.371 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.371 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.371 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.372 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.372 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.372 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.372 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.372 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.373 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.373 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.373 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.373 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.373 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.374 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.374 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.374 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.374 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.375 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.375 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.375 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.375 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.375 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.378 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.378 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.378 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.378 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.378 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.379 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.379 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.379 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.379 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.380 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.380 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.380 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.380 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.380 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.385 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.385 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.385 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.385 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.386 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.386 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.386 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.386 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.386 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.387 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.387 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.387 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.387 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.387 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.389 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.389 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.389 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.389 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.389 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.393 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.393 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.393 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.393 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.393 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.415 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.415 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.415 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.415 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.418 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.418 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.418 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.418 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.418 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.419 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.419 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.419 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.419 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.419 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.420 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.420 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.420 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.420 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.420 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.428 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.428 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.428 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.428 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.428 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.434 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.434 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.434 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.434 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.435 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.435 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.435 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.435 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.435 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.436 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.436 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.436 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.436 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.436 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.437 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.437 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.437 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.437 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.437 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.438 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.438 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.438 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.438 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.438 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.439 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.439 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.439 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.439 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.440 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.440 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.440 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.440 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.440 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.441 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.441 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.441 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.441 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.441 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.442 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.442 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.442 2 WARNING oslo_config.cfg [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 10 10:05:46 compute-2 nova_compute[235775]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 10 10:05:46 compute-2 nova_compute[235775]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 10 10:05:46 compute-2 nova_compute[235775]: and ``live_migration_inbound_addr`` respectively.
Oct 10 10:05:46 compute-2 nova_compute[235775]: ).  Its value may be silently ignored in the future.
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.443 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.443 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.443 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.443 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.443 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.444 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.444 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.444 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.444 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.444 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.445 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.445 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.445 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.445 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.446 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.446 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.446 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.446 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.446 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rbd_secret_uuid        = 21f084a3-af34-5230-afe4-ea5cd24a55f4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.447 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.447 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.447 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.447 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.448 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.448 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.448 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.448 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.448 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.449 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.449 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.449 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.449 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.449 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.450 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.450 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.450 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.450 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.450 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.451 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.451 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.451 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.451 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.451 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.452 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.452 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.452 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.452 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.452 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.453 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.453 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.453 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.453 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.453 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.454 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.454 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.454 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.454 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.454 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.455 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.455 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.455 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.455 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.455 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.457 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.457 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.457 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.457 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.457 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.458 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.458 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.458 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.458 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.458 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.459 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.459 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.459 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.459 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.459 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.460 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.460 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.460 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.460 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.460 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.461 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.461 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.461 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.461 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.461 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.462 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.462 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.462 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.462 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.462 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.463 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.463 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.463 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.463 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.463 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.464 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.464 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.464 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.464 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.464 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.465 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.465 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.465 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.465 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.465 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.466 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.466 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.466 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.466 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.466 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.467 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.467 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.467 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.467 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.467 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.468 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.468 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.468 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.468 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.468 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.469 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.469 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.469 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.469 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.469 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.470 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.470 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.470 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.470 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.470 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.471 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.471 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.471 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.471 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.472 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.472 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.472 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.472 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.472 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.473 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.473 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.473 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.473 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.473 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.474 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.474 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.474 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.474 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.474 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.475 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.475 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.475 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.475 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.475 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.476 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.476 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.476 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.476 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.476 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.477 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.477 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.477 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.477 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.478 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.478 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.478 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.478 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.478 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.479 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.479 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.479 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.479 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.480 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.480 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.480 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.480 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.481 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.481 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.481 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.481 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.481 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.482 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.482 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.482 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.482 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.482 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.483 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.483 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.483 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.483 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.484 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.484 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.484 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.484 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.484 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.486 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.486 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.486 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.486 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.486 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.487 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.487 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.487 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.487 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.487 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.488 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.488 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.488 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.488 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.488 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.489 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.489 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.489 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.489 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.491 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.491 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.491 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.491 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.491 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.492 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.492 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.492 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.492 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.492 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.493 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.493 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.493 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.493 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.494 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.494 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.494 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.494 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.495 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.495 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.495 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.495 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.495 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.496 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.496 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.496 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.496 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.496 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.497 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.497 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.497 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.497 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.497 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.498 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.498 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.498 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.498 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.498 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.499 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.499 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.499 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.499 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.499 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.500 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.500 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.500 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.500 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.500 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.501 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.501 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.501 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.501 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.501 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.502 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.502 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.502 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.502 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.502 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.503 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.503 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.503 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.503 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.503 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.504 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.504 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.504 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.504 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.504 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.505 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.505 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.505 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.505 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.506 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.506 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.506 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.507 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.507 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.507 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.507 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.508 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.508 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.508 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.508 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.508 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.509 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.509 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.509 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.509 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.510 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.510 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.510 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.510 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.510 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.516 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.516 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.516 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.516 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.516 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.517 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.517 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.517 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.517 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.517 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.524 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.524 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.524 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.524 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.524 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.525 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.525 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.525 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.525 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.525 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.526 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.526 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.526 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.526 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.527 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.570 2 INFO nova.virt.node [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Determined node identity dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from /var/lib/nova/compute_id
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.571 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.572 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.572 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.572 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.587 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f9eb1678760> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.590 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f9eb1678760> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.591 2 INFO nova.virt.libvirt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Connection event '1' reason 'None'
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.596 2 INFO nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Libvirt host capabilities <capabilities>
Oct 10 10:05:46 compute-2 nova_compute[235775]: 
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <host>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <uuid>55d065af-0252-4401-ad6e-822a36bead06</uuid>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <cpu>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <arch>x86_64</arch>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model>EPYC-Rome-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <vendor>AMD</vendor>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <microcode version='16777317'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <signature family='23' model='49' stepping='0'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <maxphysaddr mode='emulate' bits='40'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='x2apic'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='tsc-deadline'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='osxsave'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='hypervisor'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='tsc_adjust'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='spec-ctrl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='stibp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='arch-capabilities'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='cmp_legacy'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='topoext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='virt-ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='lbrv'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='tsc-scale'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='vmcb-clean'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='pause-filter'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='pfthreshold'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='svme-addr-chk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='rdctl-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='skip-l1dfl-vmentry'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='mds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature name='pschange-mc-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <pages unit='KiB' size='4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <pages unit='KiB' size='2048'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <pages unit='KiB' size='1048576'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </cpu>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <power_management>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <suspend_mem/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </power_management>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <iommu support='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <migration_features>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <live/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <uri_transports>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <uri_transport>tcp</uri_transport>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <uri_transport>rdma</uri_transport>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </uri_transports>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </migration_features>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <topology>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <cells num='1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <cell id='0'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:           <memory unit='KiB'>7864356</memory>
Oct 10 10:05:46 compute-2 nova_compute[235775]:           <pages unit='KiB' size='4'>1966089</pages>
Oct 10 10:05:46 compute-2 nova_compute[235775]:           <pages unit='KiB' size='2048'>0</pages>
Oct 10 10:05:46 compute-2 nova_compute[235775]:           <pages unit='KiB' size='1048576'>0</pages>
Oct 10 10:05:46 compute-2 nova_compute[235775]:           <distances>
Oct 10 10:05:46 compute-2 nova_compute[235775]:             <sibling id='0' value='10'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:           </distances>
Oct 10 10:05:46 compute-2 nova_compute[235775]:           <cpus num='8'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:           </cpus>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         </cell>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </cells>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </topology>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <cache>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </cache>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <secmodel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model>selinux</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <doi>0</doi>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </secmodel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <secmodel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model>dac</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <doi>0</doi>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <baselabel type='kvm'>+107:+107</baselabel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <baselabel type='qemu'>+107:+107</baselabel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </secmodel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </host>
Oct 10 10:05:46 compute-2 nova_compute[235775]: 
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <guest>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <os_type>hvm</os_type>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <arch name='i686'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <wordsize>32</wordsize>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <domain type='qemu'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <domain type='kvm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </arch>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <features>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <pae/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <nonpae/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <acpi default='on' toggle='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <apic default='on' toggle='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <cpuselection/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <deviceboot/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <disksnapshot default='on' toggle='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <externalSnapshot/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </features>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </guest>
Oct 10 10:05:46 compute-2 nova_compute[235775]: 
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <guest>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <os_type>hvm</os_type>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <arch name='x86_64'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <wordsize>64</wordsize>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <domain type='qemu'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <domain type='kvm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </arch>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <features>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <acpi default='on' toggle='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <apic default='on' toggle='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <cpuselection/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <deviceboot/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <disksnapshot default='on' toggle='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <externalSnapshot/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </features>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </guest>
Oct 10 10:05:46 compute-2 nova_compute[235775]: 
Oct 10 10:05:46 compute-2 nova_compute[235775]: </capabilities>
Oct 10 10:05:46 compute-2 nova_compute[235775]: 
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.606 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.613 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 10 10:05:46 compute-2 nova_compute[235775]: <domainCapabilities>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <path>/usr/libexec/qemu-kvm</path>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <domain>kvm</domain>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <arch>i686</arch>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <vcpu max='4096'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <iothreads supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <os supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <enum name='firmware'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <loader supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>rom</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pflash</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='readonly'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>yes</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>no</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='secure'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>no</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </loader>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </os>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <cpu>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='host-passthrough' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='hostPassthroughMigratable'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>on</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>off</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='maximum' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='maximumMigratable'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>on</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>off</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='host-model' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <vendor>AMD</vendor>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='x2apic'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc-deadline'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='hypervisor'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc_adjust'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='spec-ctrl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='stibp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='arch-capabilities'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='cmp_legacy'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='overflow-recov'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='succor'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='amd-ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='virt-ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='lbrv'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc-scale'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='vmcb-clean'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='flushbyasid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pause-filter'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pfthreshold'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='svme-addr-chk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='rdctl-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='mds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pschange-mc-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='gds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='rfds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='disable' name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='custom' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Dhyana-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Genoa'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='auto-ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Genoa-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='auto-ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-128'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-256'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-512'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v6'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v7'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='KnightsMill'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512er'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512pf'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='KnightsMill-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512er'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512pf'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G4-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tbm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G5-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tbm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SierraForest'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cmpccxadd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SierraForest-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cmpccxadd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='athlon'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='athlon-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='core2duo'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='core2duo-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='coreduo'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='coreduo-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='n270'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='n270-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='phenom'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='phenom-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </cpu>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <memoryBacking supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <enum name='sourceType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>file</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>anonymous</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>memfd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </memoryBacking>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <devices>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <disk supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='diskDevice'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>disk</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>cdrom</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>floppy</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>lun</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='bus'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>fdc</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>scsi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>sata</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-non-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </disk>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <graphics supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vnc</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>egl-headless</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>dbus</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </graphics>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <video supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='modelType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vga</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>cirrus</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>none</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>bochs</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>ramfb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </video>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <hostdev supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='mode'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>subsystem</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='startupPolicy'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>default</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>mandatory</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>requisite</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>optional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='subsysType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pci</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>scsi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='capsType'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='pciBackend'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </hostdev>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <rng supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-non-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>random</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>egd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>builtin</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </rng>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <filesystem supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='driverType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>path</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>handle</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtiofs</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </filesystem>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <tpm supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tpm-tis</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tpm-crb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>emulator</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>external</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendVersion'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>2.0</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </tpm>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <redirdev supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='bus'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </redirdev>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <channel supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pty</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>unix</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </channel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <crypto supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>qemu</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>builtin</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </crypto>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <interface supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>default</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>passt</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </interface>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <panic supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>isa</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>hyperv</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </panic>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </devices>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <features>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <gic supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <vmcoreinfo supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <genid supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <backingStoreInput supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <backup supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <async-teardown supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <ps2 supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <sev supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <sgx supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <hyperv supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='features'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>relaxed</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vapic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>spinlocks</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vpindex</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>runtime</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>synic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>stimer</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>reset</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vendor_id</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>frequencies</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>reenlightenment</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tlbflush</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>ipi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>avic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>emsr_bitmap</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>xmm_input</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </hyperv>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <launchSecurity supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </features>
Oct 10 10:05:46 compute-2 nova_compute[235775]: </domainCapabilities>
Oct 10 10:05:46 compute-2 nova_compute[235775]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.618 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 10 10:05:46 compute-2 nova_compute[235775]: <domainCapabilities>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <path>/usr/libexec/qemu-kvm</path>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <domain>kvm</domain>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <arch>i686</arch>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <vcpu max='240'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <iothreads supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <os supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <enum name='firmware'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <loader supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>rom</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pflash</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='readonly'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>yes</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>no</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='secure'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>no</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </loader>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </os>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <cpu>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='host-passthrough' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='hostPassthroughMigratable'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>on</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>off</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='maximum' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='maximumMigratable'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>on</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>off</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='host-model' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <vendor>AMD</vendor>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='x2apic'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc-deadline'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='hypervisor'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc_adjust'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='spec-ctrl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='stibp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='arch-capabilities'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='cmp_legacy'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='overflow-recov'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='succor'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='amd-ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='virt-ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='lbrv'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc-scale'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='vmcb-clean'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='flushbyasid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pause-filter'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pfthreshold'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='svme-addr-chk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='rdctl-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='mds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pschange-mc-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='gds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='rfds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='disable' name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='custom' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Dhyana-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Genoa'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='auto-ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Genoa-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='auto-ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-128'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-256'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-512'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v6'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v7'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='KnightsMill'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512er'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512pf'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='KnightsMill-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512er'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512pf'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G4-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tbm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G5-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tbm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SierraForest'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cmpccxadd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SierraForest-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cmpccxadd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='athlon'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='athlon-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='core2duo'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='core2duo-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='coreduo'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='coreduo-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='n270'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='n270-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='phenom'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='phenom-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </cpu>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <memoryBacking supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <enum name='sourceType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>file</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>anonymous</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>memfd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </memoryBacking>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <devices>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <disk supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='diskDevice'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>disk</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>cdrom</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>floppy</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>lun</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='bus'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>ide</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>fdc</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>scsi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>sata</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-non-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </disk>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <graphics supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vnc</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>egl-headless</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>dbus</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </graphics>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <video supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='modelType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vga</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>cirrus</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>none</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>bochs</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>ramfb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </video>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <hostdev supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='mode'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>subsystem</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='startupPolicy'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>default</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>mandatory</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>requisite</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>optional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='subsysType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pci</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>scsi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='capsType'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='pciBackend'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </hostdev>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <rng supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-non-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>random</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>egd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>builtin</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </rng>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <filesystem supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='driverType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>path</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>handle</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtiofs</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </filesystem>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <tpm supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tpm-tis</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tpm-crb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>emulator</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>external</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendVersion'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>2.0</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </tpm>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <redirdev supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='bus'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </redirdev>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <channel supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pty</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>unix</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </channel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <crypto supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>qemu</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>builtin</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </crypto>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <interface supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>default</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>passt</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </interface>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <panic supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>isa</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>hyperv</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </panic>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </devices>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <features>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <gic supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <vmcoreinfo supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <genid supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <backingStoreInput supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <backup supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <async-teardown supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <ps2 supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <sev supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <sgx supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <hyperv supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='features'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>relaxed</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vapic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>spinlocks</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vpindex</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>runtime</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>synic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>stimer</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>reset</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vendor_id</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>frequencies</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>reenlightenment</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tlbflush</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>ipi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>avic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>emsr_bitmap</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>xmm_input</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </hyperv>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <launchSecurity supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </features>
Oct 10 10:05:46 compute-2 nova_compute[235775]: </domainCapabilities>
Oct 10 10:05:46 compute-2 nova_compute[235775]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.661 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.664 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 10 10:05:46 compute-2 nova_compute[235775]: <domainCapabilities>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <path>/usr/libexec/qemu-kvm</path>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <domain>kvm</domain>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <arch>x86_64</arch>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <vcpu max='4096'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <iothreads supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <os supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <enum name='firmware'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>efi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <loader supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>rom</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pflash</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='readonly'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>yes</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>no</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='secure'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>yes</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>no</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </loader>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </os>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <cpu>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='host-passthrough' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='hostPassthroughMigratable'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>on</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>off</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='maximum' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='maximumMigratable'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>on</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>off</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='host-model' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <vendor>AMD</vendor>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='x2apic'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc-deadline'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='hypervisor'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc_adjust'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='spec-ctrl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='stibp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='arch-capabilities'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='cmp_legacy'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='overflow-recov'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='succor'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='amd-ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='virt-ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='lbrv'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc-scale'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='vmcb-clean'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='flushbyasid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pause-filter'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pfthreshold'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='svme-addr-chk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='rdctl-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='mds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pschange-mc-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='gds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='rfds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='disable' name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='custom' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Dhyana-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Genoa'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='auto-ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Genoa-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='auto-ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-128'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-256'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-512'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v6'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v7'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='KnightsMill'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512er'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512pf'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='KnightsMill-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512er'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512pf'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G4-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tbm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G5-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tbm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SierraForest'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cmpccxadd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SierraForest-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cmpccxadd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='athlon'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='athlon-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='core2duo'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='core2duo-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='coreduo'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='coreduo-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='n270'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='n270-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='phenom'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='phenom-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </cpu>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <memoryBacking supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <enum name='sourceType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>file</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>anonymous</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>memfd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </memoryBacking>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <devices>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <disk supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='diskDevice'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>disk</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>cdrom</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>floppy</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>lun</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='bus'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>fdc</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>scsi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>sata</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-non-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </disk>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <graphics supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vnc</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>egl-headless</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>dbus</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </graphics>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <video supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='modelType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vga</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>cirrus</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>none</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>bochs</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>ramfb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </video>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <hostdev supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='mode'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>subsystem</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='startupPolicy'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>default</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>mandatory</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>requisite</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>optional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='subsysType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pci</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>scsi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='capsType'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='pciBackend'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </hostdev>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <rng supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-non-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>random</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>egd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>builtin</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </rng>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <filesystem supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='driverType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>path</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>handle</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtiofs</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </filesystem>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <tpm supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tpm-tis</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tpm-crb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>emulator</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>external</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendVersion'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>2.0</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </tpm>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <redirdev supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='bus'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </redirdev>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <channel supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pty</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>unix</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </channel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <crypto supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>qemu</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>builtin</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </crypto>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <interface supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>default</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>passt</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </interface>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <panic supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>isa</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>hyperv</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </panic>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </devices>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <features>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <gic supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <vmcoreinfo supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <genid supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <backingStoreInput supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <backup supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <async-teardown supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <ps2 supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <sev supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <sgx supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <hyperv supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='features'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>relaxed</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vapic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>spinlocks</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vpindex</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>runtime</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>synic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>stimer</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>reset</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vendor_id</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>frequencies</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>reenlightenment</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tlbflush</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>ipi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>avic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>emsr_bitmap</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>xmm_input</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </hyperv>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <launchSecurity supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </features>
Oct 10 10:05:46 compute-2 nova_compute[235775]: </domainCapabilities>
Oct 10 10:05:46 compute-2 nova_compute[235775]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.733 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 10 10:05:46 compute-2 nova_compute[235775]: <domainCapabilities>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <path>/usr/libexec/qemu-kvm</path>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <domain>kvm</domain>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <arch>x86_64</arch>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <vcpu max='240'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <iothreads supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <os supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <enum name='firmware'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <loader supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>rom</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pflash</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='readonly'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>yes</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>no</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='secure'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>no</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </loader>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </os>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <cpu>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='host-passthrough' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='hostPassthroughMigratable'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>on</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>off</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='maximum' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='maximumMigratable'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>on</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>off</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='host-model' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <vendor>AMD</vendor>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='x2apic'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc-deadline'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='hypervisor'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc_adjust'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='spec-ctrl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='stibp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='arch-capabilities'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='cmp_legacy'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='overflow-recov'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='succor'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='amd-ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='virt-ssbd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='lbrv'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='tsc-scale'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='vmcb-clean'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='flushbyasid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pause-filter'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pfthreshold'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='svme-addr-chk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='rdctl-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='mds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='pschange-mc-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='gds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='require' name='rfds-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <feature policy='disable' name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <mode name='custom' supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Broadwell-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cascadelake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Cooperlake-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Denverton-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Dhyana-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Genoa'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='auto-ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Genoa-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='auto-ibrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Milan-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amd-psfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='no-nested-data-bp'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='null-sel-clr-base'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='stibp-always-on'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-Rome-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='EPYC-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='GraniteRapids-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-128'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-256'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx10-512'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='prefetchiti'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Haswell-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-noTSX'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v6'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Icelake-Server-v7'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='IvyBridge-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='KnightsMill'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512er'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512pf'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='KnightsMill-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4fmaps'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-4vnniw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512er'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512pf'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G4-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tbm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Opteron_G5-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fma4'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tbm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xop'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SapphireRapids-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='amx-tile'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-bf16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-fp16'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512-vpopcntdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bitalg'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vbmi2'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrc'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fzrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='la57'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='taa-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='tsx-ldtrk'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xfd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SierraForest'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cmpccxadd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='SierraForest-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ifma'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-ne-convert'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx-vnni-int8'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='bus-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cmpccxadd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fbsdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='fsrs'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ibrs-all'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mcdt-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pbrsb-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='psdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='sbdr-ssdp-no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='serialize'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vaes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='vpclmulqdq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Client-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='hle'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='rtm'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Skylake-Server-v5'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512bw'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512cd'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512dq'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512f'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='avx512vl'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='invpcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pcid'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='pku'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='mpx'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v2'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v3'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='core-capability'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='split-lock-detect'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='Snowridge-v4'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='cldemote'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='erms'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='gfni'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdir64b'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='movdiri'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='xsaves'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='athlon'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='athlon-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='core2duo'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='core2duo-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='coreduo'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='coreduo-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='n270'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='n270-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='ss'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='phenom'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <blockers model='phenom-v1'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnow'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <feature name='3dnowext'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </blockers>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </mode>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </cpu>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <memoryBacking supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <enum name='sourceType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>file</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>anonymous</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <value>memfd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </memoryBacking>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <devices>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <disk supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='diskDevice'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>disk</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>cdrom</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>floppy</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>lun</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='bus'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>ide</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>fdc</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>scsi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>sata</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-non-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </disk>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <graphics supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vnc</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>egl-headless</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>dbus</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </graphics>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <video supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='modelType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vga</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>cirrus</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>none</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>bochs</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>ramfb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </video>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <hostdev supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='mode'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>subsystem</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='startupPolicy'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>default</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>mandatory</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>requisite</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>optional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='subsysType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pci</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>scsi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='capsType'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='pciBackend'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </hostdev>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <rng supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtio-non-transitional</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>random</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>egd</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>builtin</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </rng>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <filesystem supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='driverType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>path</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>handle</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>virtiofs</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </filesystem>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <tpm supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tpm-tis</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tpm-crb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>emulator</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>external</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendVersion'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>2.0</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </tpm>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <redirdev supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='bus'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>usb</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </redirdev>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <channel supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>pty</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>unix</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </channel>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <crypto supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='type'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>qemu</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendModel'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>builtin</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </crypto>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <interface supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='backendType'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>default</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>passt</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </interface>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <panic supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='model'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>isa</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>hyperv</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </panic>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </devices>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   <features>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <gic supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <vmcoreinfo supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <genid supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <backingStoreInput supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <backup supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <async-teardown supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <ps2 supported='yes'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <sev supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <sgx supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <hyperv supported='yes'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       <enum name='features'>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>relaxed</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vapic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>spinlocks</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vpindex</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>runtime</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>synic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>stimer</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>reset</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>vendor_id</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>frequencies</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>reenlightenment</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>tlbflush</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>ipi</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>avic</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>emsr_bitmap</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:         <value>xmm_input</value>
Oct 10 10:05:46 compute-2 nova_compute[235775]:       </enum>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     </hyperv>
Oct 10 10:05:46 compute-2 nova_compute[235775]:     <launchSecurity supported='no'/>
Oct 10 10:05:46 compute-2 nova_compute[235775]:   </features>
Oct 10 10:05:46 compute-2 nova_compute[235775]: </domainCapabilities>
Oct 10 10:05:46 compute-2 nova_compute[235775]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.796 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.796 2 INFO nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Secure Boot support detected
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.798 2 INFO nova.virt.libvirt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.799 2 INFO nova.virt.libvirt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.808 2 DEBUG nova.virt.libvirt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.886 2 DEBUG nova.virt.libvirt.volume.mount [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.912 2 INFO nova.virt.node [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Determined node identity dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from /var/lib/nova/compute_id
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.930 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Verified node dcdfa54c-9f95-46da-9af1-da3e28d81cf0 matches my host compute-2.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Oct 10 10:05:46 compute-2 nova_compute[235775]: 2025-10-10 10:05:46.976 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Oct 10 10:05:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:05:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1757370242' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.061 2 ERROR nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Could not retrieve compute node resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'dcdfa54c-9f95-46da-9af1-da3e28d81cf0' not found: No resource provider with uuid dcdfa54c-9f95-46da-9af1-da3e28d81cf0 found  ", "request_id": "req-61e1089c-3cae-4b97-a196-3693469c8ecf"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'dcdfa54c-9f95-46da-9af1-da3e28d81cf0' not found: No resource provider with uuid dcdfa54c-9f95-46da-9af1-da3e28d81cf0 found  ", "request_id": "req-61e1089c-3cae-4b97-a196-3693469c8ecf"}]}
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.081 2 DEBUG oslo_concurrency.lockutils [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.082 2 DEBUG oslo_concurrency.lockutils [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.082 2 DEBUG oslo_concurrency.lockutils [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.082 2 DEBUG nova.compute.resource_tracker [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.082 2 DEBUG oslo_concurrency.processutils [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:05:47 compute-2 rsyslogd[1001]: imjournal from <np0005479823:nova_compute>: begin to drop messages due to rate-limiting
Oct 10 10:05:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:47.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.531 2 DEBUG oslo_concurrency.processutils [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:05:47 compute-2 sshd-session[235686]: Failed password for root from 80.94.93.176 port 46532 ssh2
Oct 10 10:05:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.710 2 WARNING nova.virt.libvirt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.711 2 DEBUG nova.compute.resource_tracker [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5196MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.711 2 DEBUG oslo_concurrency.lockutils [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.711 2 DEBUG oslo_concurrency.lockutils [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.933 2 ERROR nova.compute.resource_tracker [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'dcdfa54c-9f95-46da-9af1-da3e28d81cf0' not found: No resource provider with uuid dcdfa54c-9f95-46da-9af1-da3e28d81cf0 found  ", "request_id": "req-cf76886f-f9b0-4784-853f-d6a0b7369247"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider 'dcdfa54c-9f95-46da-9af1-da3e28d81cf0' not found: No resource provider with uuid dcdfa54c-9f95-46da-9af1-da3e28d81cf0 found  ", "request_id": "req-cf76886f-f9b0-4784-853f-d6a0b7369247"}]}
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.934 2 DEBUG nova.compute.resource_tracker [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:05:47 compute-2 nova_compute[235775]: 2025-10-10 10:05:47.935 2 DEBUG nova.compute.resource_tracker [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:05:47 compute-2 unix_chkpwd[236125]: password check failed for user (root)
Oct 10 10:05:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:48.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.076 2 INFO nova.scheduler.client.report [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [req-636038da-e3d1-4fe6-9349-53e2c22c4956] Created resource provider record via placement API for resource provider with UUID dcdfa54c-9f95-46da-9af1-da3e28d81cf0 and name compute-2.ctlplane.example.com.
Oct 10 10:05:48 compute-2 ceph-mon[74913]: pgmap v587: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/39140762' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:05:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3139530236' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:05:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1942422153' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.107 2 DEBUG oslo_concurrency.processutils [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:05:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:05:48 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/425651605' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.572 2 DEBUG oslo_concurrency.processutils [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.578 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct 10 10:05:48 compute-2 nova_compute[235775]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.578 2 INFO nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] kernel doesn't support AMD SEV
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.579 2 DEBUG nova.compute.provider_tree [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.579 2 DEBUG nova.virt.libvirt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 10 10:05:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.635 2 DEBUG nova.scheduler.client.report [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Updated inventory for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.636 2 DEBUG nova.compute.provider_tree [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Updating resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.636 2 DEBUG nova.compute.provider_tree [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 10:05:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.787 2 DEBUG nova.compute.provider_tree [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Updating resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.826 2 DEBUG nova.compute.resource_tracker [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.827 2 DEBUG oslo_concurrency.lockutils [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.827 2 DEBUG nova.service [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Oct 10 10:05:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.899 2 DEBUG nova.service [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Oct 10 10:05:48 compute-2 nova_compute[235775]: 2025-10-10 10:05:48.900 2 DEBUG nova.servicegroup.drivers.db [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Oct 10 10:05:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/425651605' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:05:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3204732520' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:05:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:05:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:49.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:05:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:05:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 5994 writes, 24K keys, 5994 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 5994 writes, 1097 syncs, 5.46 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 448 writes, 699 keys, 448 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
                                           Interval WAL: 448 writes, 217 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Oct 10 10:05:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:50.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:50 compute-2 ceph-mon[74913]: pgmap v588: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:50 compute-2 sshd-session[235686]: Failed password for root from 80.94.93.176 port 46532 ssh2
Oct 10 10:05:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:51.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:52 compute-2 sshd-session[235686]: Received disconnect from 80.94.93.176 port 46532:11:  [preauth]
Oct 10 10:05:52 compute-2 sshd-session[235686]: Disconnected from authenticating user root 80.94.93.176 port 46532 [preauth]
Oct 10 10:05:52 compute-2 sshd-session[235686]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 10 10:05:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:52.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:52 compute-2 ceph-mon[74913]: pgmap v589: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:52 compute-2 unix_chkpwd[236158]: password check failed for user (root)
Oct 10 10:05:52 compute-2 sshd-session[236154]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 10 10:05:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:53.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:53 compute-2 podman[236159]: 2025-10-10 10:05:53.819193852 +0000 UTC m=+0.091921573 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 10 10:05:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:05:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:54.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:05:54 compute-2 ceph-mon[74913]: pgmap v590: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:05:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:54 compute-2 sshd-session[236154]: Failed password for root from 80.94.93.176 port 24984 ssh2
Oct 10 10:05:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:05:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:05:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:55.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:05:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:55 compute-2 sudo[236181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:05:55 compute-2 sudo[236181]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:05:55 compute-2 sudo[236181]: pam_unix(sudo:session): session closed for user root
Oct 10 10:05:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:56.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:56 compute-2 ceph-mon[74913]: pgmap v591: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:56 compute-2 unix_chkpwd[236208]: password check failed for user (root)
Oct 10 10:05:57 compute-2 ceph-mon[74913]: pgmap v592: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:05:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:05:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:57.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:05:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:58.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:05:59 compute-2 sshd-session[236154]: Failed password for root from 80.94.93.176 port 24984 ssh2
Oct 10 10:05:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:05:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:05:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:59.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:05:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:05:59 compute-2 ceph-mon[74913]: pgmap v593: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:06:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:00.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:00 compute-2 unix_chkpwd[236213]: password check failed for user (root)
Oct 10 10:06:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:06:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:01.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:06:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:01 compute-2 ceph-mon[74913]: pgmap v594: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:06:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:06:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:02.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:06:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:03.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:06:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:03 compute-2 sshd-session[236154]: Failed password for root from 80.94.93.176 port 24984 ssh2
Oct 10 10:06:03 compute-2 ceph-mon[74913]: pgmap v595: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:06:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:04.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 49 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:05 compute-2 sshd-session[236154]: Received disconnect from 80.94.93.176 port 24984:11:  [preauth]
Oct 10 10:06:05 compute-2 sshd-session[236154]: Disconnected from authenticating user root 80.94.93.176 port 24984 [preauth]
Oct 10 10:06:05 compute-2 sshd-session[236154]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 10 10:06:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:05.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:05 compute-2 ceph-mon[74913]: pgmap v596: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:06:05 compute-2 unix_chkpwd[236221]: password check failed for user (root)
Oct 10 10:06:05 compute-2 sshd-session[236219]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 10 10:06:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:06.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:07.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:07 compute-2 ceph-mon[74913]: pgmap v597: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:06:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:08.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:08 compute-2 sshd-session[236219]: Failed password for root from 80.94.93.176 port 31898 ssh2
Oct 10 10:06:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:09.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:09 compute-2 ceph-mon[74913]: pgmap v598: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:06:09 compute-2 unix_chkpwd[236226]: password check failed for user (root)
Oct 10 10:06:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:10.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:10 compute-2 podman[236228]: 2025-10-10 10:06:10.799769343 +0000 UTC m=+0.070607003 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:06:10 compute-2 podman[236230]: 2025-10-10 10:06:10.807698536 +0000 UTC m=+0.070498829 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct 10 10:06:10 compute-2 podman[236229]: 2025-10-10 10:06:10.835157133 +0000 UTC m=+0.096396026 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct 10 10:06:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:11.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:11 compute-2 ceph-mon[74913]: pgmap v599: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:06:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:12.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:12 compute-2 sshd-session[236219]: Failed password for root from 80.94.93.176 port 31898 ssh2
Oct 10 10:06:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003fa0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:13.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:13 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:06:13 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1790607881' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:06:13 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:06:13 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1790607881' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:06:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:13 compute-2 ceph-mon[74913]: pgmap v600: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:06:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1790607881' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:06:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1790607881' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:06:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/2805633632' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:06:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/2805633632' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:06:13 compute-2 unix_chkpwd[236295]: password check failed for user (root)
Oct 10 10:06:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:06:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:14.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:06:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:14 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/2939433712' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:06:14 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/2939433712' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:06:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:15.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:15 compute-2 sshd-session[236219]: Failed password for root from 80.94.93.176 port 31898 ssh2
Oct 10 10:06:15 compute-2 sudo[236298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:06:15 compute-2 sudo[236298]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:06:15 compute-2 sudo[236298]: pam_unix(sudo:session): session closed for user root
Oct 10 10:06:15 compute-2 ceph-mon[74913]: pgmap v601: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:06:16 compute-2 sshd-session[236219]: Received disconnect from 80.94.93.176 port 31898:11:  [preauth]
Oct 10 10:06:16 compute-2 sshd-session[236219]: Disconnected from authenticating user root 80.94.93.176 port 31898 [preauth]
Oct 10 10:06:16 compute-2 sshd-session[236219]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 10 10:06:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:16.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003ec0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003fe0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:06:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:17.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:17 compute-2 ceph-mon[74913]: pgmap v602: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:06:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:18.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:18 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:18 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:18 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:06:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:19.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:06:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:19 compute-2 ceph-mon[74913]: pgmap v603: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:06:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:06:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:20.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:06:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:21.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:21 compute-2 ceph-mon[74913]: pgmap v604: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:06:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:22.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:23.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100623 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:06:24 compute-2 ceph-mon[74913]: pgmap v605: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 10:06:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:24.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:24 compute-2 podman[236332]: 2025-10-10 10:06:24.788503485 +0000 UTC m=+0.069007602 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 10 10:06:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:24 compute-2 nova_compute[235775]: 2025-10-10 10:06:24.901 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:06:24 compute-2 nova_compute[235775]: 2025-10-10 10:06:24.966 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:06:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:25.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:26 compute-2 ceph-mon[74913]: pgmap v606: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:06:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:26.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:06:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:27.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:06:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:28 compute-2 ceph-mon[74913]: pgmap v607: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:06:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:28.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:06:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:29.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:06:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:30 compute-2 ceph-mon[74913]: pgmap v608: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:06:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:30.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0040a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:31.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:32 compute-2 ceph-mon[74913]: pgmap v609: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:06:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:06:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:32.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:06:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003ee0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0040c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:06:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:33.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:06:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:34 compute-2 ceph-mon[74913]: pgmap v610: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:06:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:34.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:06:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:35.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:06:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:35 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:06:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:35 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:06:35 compute-2 sudo[236362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:06:35 compute-2 sudo[236362]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:06:35 compute-2 sudo[236362]: pam_unix(sudo:session): session closed for user root
Oct 10 10:06:36 compute-2 ceph-mon[74913]: pgmap v611: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:06:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:36.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0040e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868001ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:37.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:38.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:38 compute-2 ceph-mon[74913]: pgmap v612: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:06:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:06:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003fb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004100 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:39.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:40.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:40 compute-2 ceph-mon[74913]: pgmap v613: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:06:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868001ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003fd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004120 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:41.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:06:41.458 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:06:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:06:41.458 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:06:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:06:41.459 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:06:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:41 compute-2 podman[236394]: 2025-10-10 10:06:41.776787283 +0000 UTC m=+0.050470920 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Oct 10 10:06:41 compute-2 podman[236395]: 2025-10-10 10:06:41.801910404 +0000 UTC m=+0.075272022 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct 10 10:06:41 compute-2 podman[236396]: 2025-10-10 10:06:41.809208568 +0000 UTC m=+0.079336572 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2)
Oct 10 10:06:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:42.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:42 compute-2 ceph-mon[74913]: pgmap v614: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:06:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868001ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:43.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100643 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:06:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:44.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:44 compute-2 ceph-mon[74913]: pgmap v615: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:06:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0041d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868001ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:45 compute-2 ceph-mon[74913]: pgmap v616: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:06:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:45.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.818 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.818 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.819 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.819 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.835 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.836 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.836 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.837 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.837 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.838 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.838 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.839 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.839 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.871 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.872 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.872 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.872 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:06:45 compute-2 nova_compute[235775]: 2025-10-10 10:06:45.872 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:06:46 compute-2 sudo[236483]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:06:46 compute-2 sudo[236483]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:06:46 compute-2 sudo[236483]: pam_unix(sudo:session): session closed for user root
Oct 10 10:06:46 compute-2 sudo[236508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:06:46 compute-2 sudo[236508]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:06:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:46.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3155134759' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:06:46 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:06:46 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1920469742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:06:46 compute-2 nova_compute[235775]: 2025-10-10 10:06:46.309 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:06:46 compute-2 nova_compute[235775]: 2025-10-10 10:06:46.457 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:06:46 compute-2 nova_compute[235775]: 2025-10-10 10:06:46.459 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5236MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:06:46 compute-2 nova_compute[235775]: 2025-10-10 10:06:46.459 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:06:46 compute-2 nova_compute[235775]: 2025-10-10 10:06:46.459 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:06:46 compute-2 nova_compute[235775]: 2025-10-10 10:06:46.530 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:06:46 compute-2 nova_compute[235775]: 2025-10-10 10:06:46.530 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:06:46 compute-2 nova_compute[235775]: 2025-10-10 10:06:46.556 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:06:46 compute-2 sudo[236508]: pam_unix(sudo:session): session closed for user root
Oct 10 10:06:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:46 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:06:46 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2676868760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:06:47 compute-2 nova_compute[235775]: 2025-10-10 10:06:47.001 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:06:47 compute-2 nova_compute[235775]: 2025-10-10 10:06:47.006 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:06:47 compute-2 nova_compute[235775]: 2025-10-10 10:06:47.048 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:06:47 compute-2 nova_compute[235775]: 2025-10-10 10:06:47.049 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:06:47 compute-2 nova_compute[235775]: 2025-10-10 10:06:47.049 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:06:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1920469742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:06:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:06:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2009889929' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:06:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:06:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:06:47 compute-2 ceph-mon[74913]: pgmap v617: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:06:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:06:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:06:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:06:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:06:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:06:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2676868760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:06:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:47.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:48.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3854743988' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:06:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868001ff0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004210 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:49 compute-2 ceph-mon[74913]: pgmap v618: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:06:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1886198263' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:06:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:49.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:50.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:51.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:51 compute-2 sudo[236595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:06:51 compute-2 sudo[236595]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:06:51 compute-2 sudo[236595]: pam_unix(sudo:session): session closed for user root
Oct 10 10:06:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:51 compute-2 ceph-mon[74913]: pgmap v619: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:06:51 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:06:51 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:06:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:06:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:52.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:06:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100652 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:06:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:53.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:53 compute-2 ceph-mon[74913]: pgmap v620: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:06:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:54.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:06:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:55.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:55 compute-2 podman[236624]: 2025-10-10 10:06:55.772830788 +0000 UTC m=+0.049325484 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:06:55 compute-2 ceph-mon[74913]: pgmap v621: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:06:56 compute-2 sudo[236644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:06:56 compute-2 sudo[236644]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:06:56 compute-2 sudo[236644]: pam_unix(sudo:session): session closed for user root
Oct 10 10:06:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:06:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:56.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:06:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:57.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:57 compute-2 ceph-mon[74913]: pgmap v622: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:06:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:06:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:06:58.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:06:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:06:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:06:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:06:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:06:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:06:59.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:06:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:06:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:06:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:06:59 compute-2 ceph-mon[74913]: pgmap v623: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:07:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:07:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:00.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:07:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:07:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:07:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:01.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:07:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:01 compute-2 ceph-mon[74913]: pgmap v624: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:07:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:07:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:02.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:03.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:03 compute-2 ceph-mon[74913]: pgmap v625: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Oct 10 10:07:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:03 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:07:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:03 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:07:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:04.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:05.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Oct 10 10:07:05 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4284610426' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Oct 10 10:07:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:05 compute-2 ceph-mon[74913]: pgmap v626: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 596 B/s wr, 1 op/s
Oct 10 10:07:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/919671696' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Oct 10 10:07:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/4284610426' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Oct 10 10:07:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:06.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:06 compute-2 rsyslogd[1001]: imjournal: 2505 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Oct 10 10:07:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004350 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:06 compute-2 ceph-mon[74913]: from='client.24518 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Oct 10 10:07:06 compute-2 ceph-mon[74913]: from='client.24518 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Oct 10 10:07:06 compute-2 ceph-mon[74913]: from='client.24748 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Oct 10 10:07:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:07:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:07.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100707 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:07:07 compute-2 ceph-mon[74913]: pgmap v627: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 10:07:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:08.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:09.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:09 compute-2 ceph-mon[74913]: pgmap v628: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 92 KiB/s rd, 938 B/s wr, 152 op/s
Oct 10 10:07:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:10.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:11.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:12 compute-2 ceph-mon[74913]: pgmap v629: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 92 KiB/s rd, 938 B/s wr, 152 op/s
Oct 10 10:07:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:12.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100712 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:07:12 compute-2 podman[236687]: 2025-10-10 10:07:12.804142558 +0000 UTC m=+0.078391422 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 10 10:07:12 compute-2 podman[236690]: 2025-10-10 10:07:12.836758159 +0000 UTC m=+0.095729625 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 10 10:07:12 compute-2 podman[236688]: 2025-10-10 10:07:12.836769329 +0000 UTC m=+0.110507906 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:07:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:13.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:14 compute-2 ceph-mon[74913]: pgmap v630: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 92 KiB/s rd, 938 B/s wr, 152 op/s
Oct 10 10:07:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:14.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:15.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:16 compute-2 ceph-mon[74913]: pgmap v631: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 91 KiB/s rd, 341 B/s wr, 150 op/s
Oct 10 10:07:16 compute-2 sudo[236752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:07:16 compute-2 sudo[236752]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:07:16 compute-2 sudo[236752]: pam_unix(sudo:session): session closed for user root
Oct 10 10:07:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:16.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:07:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:16 compute-2 kernel: ganesha.nfsd[236682]: segfault at 50 ip 00007fa91a7fc32e sp 00007fa8cfffe210 error 4 in libntirpc.so.5.8[7fa91a7e1000+2c000] likely on CPU 1 (core 0, socket 1)
Oct 10 10:07:16 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 10:07:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy ignored for local
Oct 10 10:07:16 compute-2 systemd[1]: Started Process Core Dump (PID 236779/UID 0).
Oct 10 10:07:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:07:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:17.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:18 compute-2 systemd-coredump[236780]: Process 221552 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 63:
                                                    #0  0x00007fa91a7fc32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Oct 10 10:07:18 compute-2 ceph-mon[74913]: pgmap v632: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 91 KiB/s rd, 341 B/s wr, 150 op/s
Oct 10 10:07:18 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 10:07:18 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 10:07:18 compute-2 systemd[1]: systemd-coredump@7-236779-0.service: Deactivated successfully.
Oct 10 10:07:18 compute-2 systemd[1]: systemd-coredump@7-236779-0.service: Consumed 1.162s CPU time.
Oct 10 10:07:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:18.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:18 compute-2 podman[236786]: 2025-10-10 10:07:18.192515476 +0000 UTC m=+0.025101691 container died 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 10:07:18 compute-2 systemd[1]: var-lib-containers-storage-overlay-ed46e148b9f8eeed63d17e29d76b4f2a4e379a6bb6a82713a88a44ea921fa1d2-merged.mount: Deactivated successfully.
Oct 10 10:07:18 compute-2 podman[236786]: 2025-10-10 10:07:18.229592448 +0000 UTC m=+0.062178643 container remove 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 10:07:18 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 10:07:18 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 10:07:18 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.663s CPU time.
Oct 10 10:07:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:19.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:20 compute-2 ceph-mon[74913]: pgmap v633: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 92 KiB/s rd, 938 B/s wr, 152 op/s
Oct 10 10:07:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:20.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:07:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:21.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:07:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:22 compute-2 ceph-mon[74913]: pgmap v634: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 2 op/s
Oct 10 10:07:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:22.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100722 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:07:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:23.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:24 compute-2 ceph-mon[74913]: pgmap v635: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 852 B/s wr, 2 op/s
Oct 10 10:07:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:24.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Oct 10 10:07:25 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1743770823' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Oct 10 10:07:25 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3351629822' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Oct 10 10:07:25 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1743770823' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Oct 10 10:07:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:25.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:26 compute-2 ceph-mon[74913]: pgmap v636: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 852 B/s wr, 2 op/s
Oct 10 10:07:26 compute-2 ceph-mon[74913]: from='client.15102 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Oct 10 10:07:26 compute-2 ceph-mon[74913]: from='client.15102 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Oct 10 10:07:26 compute-2 ceph-mon[74913]: from='client.24766 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Oct 10 10:07:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:26.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:07:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1938819953' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:07:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:07:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1938819953' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:07:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:26 compute-2 podman[236836]: 2025-10-10 10:07:26.801937721 +0000 UTC m=+0.076375389 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 10 10:07:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1938819953' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:07:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1938819953' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:07:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:07:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:27.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:07:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100727 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:07:28 compute-2 ceph-mon[74913]: pgmap v637: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 852 B/s wr, 2 op/s
Oct 10 10:07:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:28.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:28 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 8.
Oct 10 10:07:28 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:07:28 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.663s CPU time.
Oct 10 10:07:28 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 10:07:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:28 compute-2 podman[236908]: 2025-10-10 10:07:28.679181229 +0000 UTC m=+0.050916087 container create 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:07:28 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83eab439fa5a3e9abff5a44dbe3ed5529a7fd8a5d250f9e424122df311085d4/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 10:07:28 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83eab439fa5a3e9abff5a44dbe3ed5529a7fd8a5d250f9e424122df311085d4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 10:07:28 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83eab439fa5a3e9abff5a44dbe3ed5529a7fd8a5d250f9e424122df311085d4/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:07:28 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83eab439fa5a3e9abff5a44dbe3ed5529a7fd8a5d250f9e424122df311085d4/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:07:28 compute-2 podman[236908]: 2025-10-10 10:07:28.659266443 +0000 UTC m=+0.031001321 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 10:07:28 compute-2 podman[236908]: 2025-10-10 10:07:28.756231898 +0000 UTC m=+0.127966786 container init 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True)
Oct 10 10:07:28 compute-2 podman[236908]: 2025-10-10 10:07:28.760763254 +0000 UTC m=+0.132498112 container start 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Oct 10 10:07:28 compute-2 bash[236908]: 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5
Oct 10 10:07:28 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:07:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 10:07:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 10:07:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 10:07:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 10:07:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 10:07:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 10:07:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 10:07:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:07:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:07:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:29.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:07:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:30 compute-2 ceph-mon[74913]: pgmap v638: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:07:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:30.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:31 compute-2 ceph-mon[74913]: pgmap v639: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Oct 10 10:07:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:07:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:31.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:07:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:32.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:07:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:33 compute-2 ceph-mon[74913]: pgmap v640: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 767 B/s wr, 2 op/s
Oct 10 10:07:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:33.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100733 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:07:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:34.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:07:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:07:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:07:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:35.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:35 compute-2 ceph-mon[74913]: pgmap v641: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 511 B/s wr, 1 op/s
Oct 10 10:07:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:36.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:36 compute-2 sudo[236973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:07:36 compute-2 sudo[236973]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:07:36 compute-2 sudo[236973]: pam_unix(sudo:session): session closed for user root
Oct 10 10:07:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:07:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:37.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:07:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:37 compute-2 ceph-mon[74913]: pgmap v642: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 511 B/s wr, 1 op/s
Oct 10 10:07:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:38.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:07:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:07:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:07:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:07:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:07:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:39.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:07:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:07:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:07:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:07:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:39 compute-2 ceph-mon[74913]: pgmap v643: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 511 B/s wr, 2 op/s
Oct 10 10:07:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:40.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:07:41.460 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:07:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:07:41.460 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:07:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:07:41.460 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:07:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:41.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:41 compute-2 ceph-mon[74913]: pgmap v644: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:07:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:42.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:43.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:43 compute-2 podman[237008]: 2025-10-10 10:07:43.793675514 +0000 UTC m=+0.058886481 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 10:07:43 compute-2 podman[237007]: 2025-10-10 10:07:43.802108943 +0000 UTC m=+0.077443993 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:07:43 compute-2 podman[237006]: 2025-10-10 10:07:43.817504344 +0000 UTC m=+0.094034123 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:07:43 compute-2 ceph-mon[74913]: pgmap v645: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 767 B/s wr, 3 op/s
Oct 10 10:07:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:44.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:45.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:45 compute-2 ceph-mon[74913]: pgmap v646: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 341 B/s wr, 2 op/s
Oct 10 10:07:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:46.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d90000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:07:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.040 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.040 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.061 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.061 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.061 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.074 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.074 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.074 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.074 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.074 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.075 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:07:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:47.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.834 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.834 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.835 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.835 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:07:47 compute-2 nova_compute[235775]: 2025-10-10 10:07:47.835 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:07:47 compute-2 ceph-mon[74913]: pgmap v647: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 341 B/s wr, 2 op/s
Oct 10 10:07:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/4016404872' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:07:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3155772583' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:07:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1191965780' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:07:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:48.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:07:48 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/85312783' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.288 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.434 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.435 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5259MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.435 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.436 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.491 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.491 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.507 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:07:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:07:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:07:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100748 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:07:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:07:48 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4252308067' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:07:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.951 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.958 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:07:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/85312783' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:07:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3798602020' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:07:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4252308067' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.975 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.978 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:07:48 compute-2 nova_compute[235775]: 2025-10-10 10:07:48.978 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:07:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:49.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:50 compute-2 ceph-mon[74913]: pgmap v648: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Oct 10 10:07:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:07:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:50.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:07:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d680016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:51.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:51 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:07:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:51 compute-2 sudo[237138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:07:51 compute-2 sudo[237138]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:07:51 compute-2 sudo[237138]: pam_unix(sudo:session): session closed for user root
Oct 10 10:07:51 compute-2 sudo[237163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:07:51 compute-2 sudo[237163]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:07:52 compute-2 ceph-mon[74913]: pgmap v649: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Oct 10 10:07:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:52.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:52 compute-2 sudo[237163]: pam_unix(sudo:session): session closed for user root
Oct 10 10:07:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d680016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:53 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:07:53 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:07:53 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:07:53 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:07:53 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:07:53 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:07:53 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:07:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:07:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:53.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:07:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100753 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:07:54 compute-2 ceph-mon[74913]: pgmap v650: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Oct 10 10:07:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:07:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:54.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:07:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:07:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:07:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:55.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:07:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:56 compute-2 ceph-mon[74913]: pgmap v651: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:07:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:56.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:56 compute-2 sudo[237224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:07:56 compute-2 sudo[237224]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:07:56 compute-2 sudo[237224]: pam_unix(sudo:session): session closed for user root
Oct 10 10:07:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d680016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:07:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:57.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.548366) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877548450, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2368, "num_deletes": 251, "total_data_size": 6249742, "memory_usage": 6358064, "flush_reason": "Manual Compaction"}
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877575466, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4069084, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20848, "largest_seqno": 23210, "table_properties": {"data_size": 4059605, "index_size": 5973, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19602, "raw_average_key_size": 20, "raw_value_size": 4040653, "raw_average_value_size": 4165, "num_data_blocks": 262, "num_entries": 970, "num_filter_entries": 970, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090666, "oldest_key_time": 1760090666, "file_creation_time": 1760090877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 27163 microseconds, and 15864 cpu microseconds.
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.575541) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4069084 bytes OK
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.575571) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.577272) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.577296) EVENT_LOG_v1 {"time_micros": 1760090877577288, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.577328) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6239323, prev total WAL file size 6275848, number of live WAL files 2.
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.580022) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(3973KB)], [39(12MB)]
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877580077, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 16934462, "oldest_snapshot_seqno": -1}
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5420 keys, 14721321 bytes, temperature: kUnknown
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877664661, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 14721321, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14683056, "index_size": 23627, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 136693, "raw_average_key_size": 25, "raw_value_size": 14582880, "raw_average_value_size": 2690, "num_data_blocks": 976, "num_entries": 5420, "num_filter_entries": 5420, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.664995) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 14721321 bytes
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.666117) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 200.0 rd, 173.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.3 +0.0 blob) out(14.0 +0.0 blob), read-write-amplify(7.8) write-amplify(3.6) OK, records in: 5940, records dropped: 520 output_compression: NoCompression
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.666133) EVENT_LOG_v1 {"time_micros": 1760090877666125, "job": 22, "event": "compaction_finished", "compaction_time_micros": 84669, "compaction_time_cpu_micros": 49161, "output_level": 6, "num_output_files": 1, "total_output_size": 14721321, "num_input_records": 5940, "num_output_records": 5420, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877667002, "job": 22, "event": "table_file_deletion", "file_number": 41}
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877669172, "job": 22, "event": "table_file_deletion", "file_number": 39}
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.579958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.669259) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.669264) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.669265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.669266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:07:57 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.669268) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:07:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:57 compute-2 sudo[237257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:07:57 compute-2 sudo[237257]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:07:57 compute-2 sudo[237257]: pam_unix(sudo:session): session closed for user root
Oct 10 10:07:57 compute-2 podman[237251]: 2025-10-10 10:07:57.820737277 +0000 UTC m=+0.078459906 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 10:07:58 compute-2 ceph-mon[74913]: pgmap v652: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:07:58 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:07:58 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:07:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:07:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:58.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:07:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:07:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:07:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:07:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:59.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:07:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:07:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:00 compute-2 ceph-mon[74913]: pgmap v653: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:08:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:00.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:01.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:02 compute-2 ceph-mon[74913]: pgmap v654: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Oct 10 10:08:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:08:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:02.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:03.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:04 compute-2 ceph-mon[74913]: pgmap v655: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 255 B/s wr, 1 op/s
Oct 10 10:08:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:04.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:05.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:06 compute-2 ceph-mon[74913]: pgmap v656: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:06.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:07.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:08 compute-2 ceph-mon[74913]: pgmap v657: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:08.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:09.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:10 compute-2 ceph-mon[74913]: pgmap v658: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:10.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:08:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:11.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:08:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:12 compute-2 ceph-mon[74913]: pgmap v659: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:12.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:13 compute-2 ceph-mon[74913]: pgmap v660: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:08:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:08:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:13.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:08:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:14.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:14 compute-2 podman[237314]: 2025-10-10 10:08:14.7789432 +0000 UTC m=+0.054160611 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=iscsid)
Oct 10 10:08:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:14 compute-2 podman[237312]: 2025-10-10 10:08:14.793650109 +0000 UTC m=+0.067355241 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct 10 10:08:14 compute-2 podman[237313]: 2025-10-10 10:08:14.824657679 +0000 UTC m=+0.102793442 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 10 10:08:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:15.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:15 compute-2 ceph-mon[74913]: pgmap v661: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:16.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:16 compute-2 sudo[237377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:08:16 compute-2 sudo[237377]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:08:16 compute-2 sudo[237377]: pam_unix(sudo:session): session closed for user root
Oct 10 10:08:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:08:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:17.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:17 compute-2 ceph-mon[74913]: pgmap v662: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:08:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:18.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:08:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:19.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:19 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:08:19.604 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:08:19 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:08:19.606 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:08:19 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:08:19.607 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:08:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:19 compute-2 ceph-mon[74913]: pgmap v663: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:20.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:08:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:21.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:08:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:21 compute-2 ceph-mon[74913]: pgmap v664: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:08:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:22.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:08:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d600016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:23.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:23 compute-2 ceph-mon[74913]: pgmap v665: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:08:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:24.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d600016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:08:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:25.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:08:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:25 compute-2 ceph-mon[74913]: pgmap v666: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:26.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/954143266' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:08:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/954143266' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:08:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d600016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:27.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:27 compute-2 ceph-mon[74913]: pgmap v667: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:28.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:28 compute-2 podman[237416]: 2025-10-10 10:08:28.779634901 +0000 UTC m=+0.052358072 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 10 10:08:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:29.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:30 compute-2 ceph-mon[74913]: pgmap v668: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:08:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:30.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:08:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:31.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:32 compute-2 ceph-mon[74913]: pgmap v669: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:08:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:32.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:33.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:34 compute-2 ceph-mon[74913]: pgmap v670: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:08:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:34.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:35.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:36 compute-2 ceph-mon[74913]: pgmap v671: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:36.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:36 compute-2 sudo[237443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:08:36 compute-2 sudo[237443]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:08:36 compute-2 sudo[237443]: pam_unix(sudo:session): session closed for user root
Oct 10 10:08:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:36 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004160 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:36 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:36 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:37.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:38 compute-2 ceph-mon[74913]: pgmap v672: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:38.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:08:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:39.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:08:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:40 compute-2 ceph-mon[74913]: pgmap v673: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:08:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:40.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:08:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:40 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:40 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:41 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0041a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:08:41.461 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:08:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:08:41.461 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:08:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:08:41.461 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:08:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:41.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:42 compute-2 ceph-mon[74913]: pgmap v674: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:42.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:42 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:42 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:43 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:08:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:43.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:08:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:44 compute-2 ceph-mon[74913]: pgmap v675: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:08:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:44.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:44 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0041c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:44 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:45 compute-2 ceph-mon[74913]: pgmap v676: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:45.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:45 compute-2 podman[237480]: 2025-10-10 10:08:45.779291246 +0000 UTC m=+0.054247044 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct 10 10:08:45 compute-2 podman[237478]: 2025-10-10 10:08:45.784577834 +0000 UTC m=+0.055679658 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:08:45 compute-2 podman[237479]: 2025-10-10 10:08:45.833649411 +0000 UTC m=+0.111134549 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 10:08:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:46.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:08:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0041e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:46 compute-2 nova_compute[235775]: 2025-10-10 10:08:46.974 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:08:46 compute-2 nova_compute[235775]: 2025-10-10 10:08:46.974 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:08:46 compute-2 nova_compute[235775]: 2025-10-10 10:08:46.974 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:08:46 compute-2 nova_compute[235775]: 2025-10-10 10:08:46.975 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:08:46 compute-2 nova_compute[235775]: 2025-10-10 10:08:46.989 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:08:46 compute-2 nova_compute[235775]: 2025-10-10 10:08:46.989 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:08:46 compute-2 nova_compute[235775]: 2025-10-10 10:08:46.989 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:08:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:47 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:47 compute-2 ceph-mon[74913]: pgmap v677: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:47.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:47 compute-2 nova_compute[235775]: 2025-10-10 10:08:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:08:47 compute-2 nova_compute[235775]: 2025-10-10 10:08:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:08:47 compute-2 nova_compute[235775]: 2025-10-10 10:08:47.836 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:08:47 compute-2 nova_compute[235775]: 2025-10-10 10:08:47.837 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:08:47 compute-2 nova_compute[235775]: 2025-10-10 10:08:47.837 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:08:47 compute-2 nova_compute[235775]: 2025-10-10 10:08:47.837 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:08:47 compute-2 nova_compute[235775]: 2025-10-10 10:08:47.837 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:08:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:08:48 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1968454522' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.247 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:08:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:48.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3922966226' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:08:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1968454522' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:08:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2549355807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.445 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.446 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5231MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.446 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.446 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.502 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.503 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.521 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:08:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:08:48 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4155216318' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.962 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.968 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.990 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.993 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:08:48 compute-2 nova_compute[235775]: 2025-10-10 10:08:48.993 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:08:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:49 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:49 compute-2 ceph-mon[74913]: pgmap v678: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4155216318' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:08:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2040447484' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:08:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:49.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:49 compute-2 nova_compute[235775]: 2025-10-10 10:08:49.994 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:08:49 compute-2 nova_compute[235775]: 2025-10-10 10:08:49.994 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:08:49 compute-2 nova_compute[235775]: 2025-10-10 10:08:49.995 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:08:49 compute-2 nova_compute[235775]: 2025-10-10 10:08:49.995 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:08:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:50.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:50 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1585741613' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:08:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:51 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:51 compute-2 ceph-mon[74913]: pgmap v679: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:08:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:51.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:52.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:53 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:53.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:53 compute-2 ceph-mon[74913]: pgmap v680: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 10:08:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:54.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:55 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:08:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:55.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:55 compute-2 ceph-mon[74913]: pgmap v681: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:08:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:56.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:56 compute-2 sudo[237596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:08:56 compute-2 sudo[237596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:08:56 compute-2 sudo[237596]: pam_unix(sudo:session): session closed for user root
Oct 10 10:08:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:57 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:57.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:57 compute-2 ceph-mon[74913]: pgmap v682: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:08:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100857 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:08:57 compute-2 sudo[237623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:08:57 compute-2 sudo[237623]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:08:57 compute-2 sudo[237623]: pam_unix(sudo:session): session closed for user root
Oct 10 10:08:58 compute-2 sudo[237648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Oct 10 10:08:58 compute-2 sudo[237648]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:08:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:08:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:58.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:08:58 compute-2 podman[237744]: 2025-10-10 10:08:58.596564214 +0000 UTC m=+0.104186328 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Oct 10 10:08:58 compute-2 podman[237744]: 2025-10-10 10:08:58.693384345 +0000 UTC m=+0.201006439 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Oct 10 10:08:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:58 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct 10 10:08:58 compute-2 podman[237798]: 2025-10-10 10:08:58.920220628 +0000 UTC m=+0.080977707 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct 10 10:08:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:59 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:08:59 compute-2 podman[237882]: 2025-10-10 10:08:59.170947872 +0000 UTC m=+0.053358204 container exec 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 10:08:59 compute-2 podman[237882]: 2025-10-10 10:08:59.176058197 +0000 UTC m=+0.058468529 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 10:08:59 compute-2 podman[237974]: 2025-10-10 10:08:59.52447392 +0000 UTC m=+0.064776469 container exec 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Oct 10 10:08:59 compute-2 podman[237974]: 2025-10-10 10:08:59.537171266 +0000 UTC m=+0.077473715 container exec_died 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 10:08:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:08:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:08:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:59.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:08:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:08:59 compute-2 podman[238036]: 2025-10-10 10:08:59.801375691 +0000 UTC m=+0.061511505 container exec 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 10:08:59 compute-2 podman[238036]: 2025-10-10 10:08:59.842265698 +0000 UTC m=+0.102401482 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 10:08:59 compute-2 ceph-mon[74913]: pgmap v683: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:09:00 compute-2 podman[238102]: 2025-10-10 10:09:00.105280585 +0000 UTC m=+0.076814144 container exec 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, vcs-type=git, release=1793, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, vendor=Red Hat, Inc., version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Oct 10 10:09:00 compute-2 podman[238102]: 2025-10-10 10:09:00.160071014 +0000 UTC m=+0.131604563 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, distribution-scope=public, architecture=x86_64, release=1793, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, vcs-type=git)
Oct 10 10:09:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:00.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:00 compute-2 sudo[237648]: pam_unix(sudo:session): session closed for user root
Oct 10 10:09:00 compute-2 sudo[238171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:09:00 compute-2 sudo[238171]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:09:00 compute-2 sudo[238171]: pam_unix(sudo:session): session closed for user root
Oct 10 10:09:00 compute-2 sudo[238196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:09:00 compute-2 sudo[238196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:09:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:01 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:01 compute-2 sudo[238196]: pam_unix(sudo:session): session closed for user root
Oct 10 10:09:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:09:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:09:01 compute-2 ceph-mon[74913]: pgmap v684: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:09:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:09:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:09:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 10:09:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:09:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:01.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:02.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:02 compute-2 PackageKit[172997]: daemon quit
Oct 10 10:09:02 compute-2 systemd[1]: packagekit.service: Deactivated successfully.
Oct 10 10:09:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct 10 10:09:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:09:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:09:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:09:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:09:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:09:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:09:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:09:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:03 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:03 compute-2 ceph-mon[74913]: pgmap v685: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Oct 10 10:09:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:03.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:04.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:05 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:05.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:05 compute-2 ceph-mon[74913]: pgmap v686: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:09:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:06.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:09:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:06 compute-2 sudo[238259]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:09:06 compute-2 sudo[238259]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:09:06 compute-2 sudo[238259]: pam_unix(sudo:session): session closed for user root
Oct 10 10:09:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:07 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c004470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:07.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:09:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:09:07 compute-2 ceph-mon[74913]: pgmap v687: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:09:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:08.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:09 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:09.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:09 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:09:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:09 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:09:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:09 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:09:09 compute-2 ceph-mon[74913]: pgmap v688: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 10:09:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:09:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:10.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:09:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c004470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:11 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:11.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:11 compute-2 ceph-mon[74913]: pgmap v689: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 10:09:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:12.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:09:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c004470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:13 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:13.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:13 compute-2 ceph-mon[74913]: pgmap v690: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:09:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:14.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:15 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:15.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:15 compute-2 ceph-mon[74913]: pgmap v691: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:09:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:16.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:16 compute-2 sudo[238294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:09:16 compute-2 sudo[238294]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:09:16 compute-2 sudo[238294]: pam_unix(sudo:session): session closed for user root
Oct 10 10:09:16 compute-2 podman[238317]: 2025-10-10 10:09:16.80155126 +0000 UTC m=+0.075608576 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:09:16 compute-2 podman[238320]: 2025-10-10 10:09:16.816782345 +0000 UTC m=+0.075684517 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 10 10:09:16 compute-2 podman[238319]: 2025-10-10 10:09:16.817390825 +0000 UTC m=+0.090538541 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 10 10:09:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:09:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:17 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:09:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:17.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:09:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100917 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:09:17 compute-2 ceph-mon[74913]: pgmap v692: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:09:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:18.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:19 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003cd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:19.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:20 compute-2 ceph-mon[74913]: pgmap v693: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1023 B/s wr, 4 op/s
Oct 10 10:09:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:20.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:21 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:21.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:22 compute-2 ceph-mon[74913]: pgmap v694: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 426 B/s wr, 2 op/s
Oct 10 10:09:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:09:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:22.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:09:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003d80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:23 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74002420 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:23.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:24 compute-2 ceph-mon[74913]: pgmap v695: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 426 B/s wr, 2 op/s
Oct 10 10:09:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:24.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003d80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:25 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:25.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:26 compute-2 ceph-mon[74913]: pgmap v696: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:09:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:09:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:26.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:09:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100926 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:09:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74002420 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:27 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003d80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3145259585' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:09:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3145259585' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:09:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:09:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:27.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:09:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:28 compute-2 ceph-mon[74913]: pgmap v697: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:09:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:28.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74002420 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:29 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:09:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:29.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:09:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:29 compute-2 podman[238396]: 2025-10-10 10:09:29.773472494 +0000 UTC m=+0.052986003 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 10:09:30 compute-2 ceph-mon[74913]: pgmap v698: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:09:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:30.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003d80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:31 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74002420 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:09:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:31.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:09:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:32 compute-2 ceph-mon[74913]: pgmap v699: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Oct 10 10:09:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:09:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:32.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003d80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:33 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:33.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:34 compute-2 ceph-mon[74913]: pgmap v700: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:09:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:34.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:35 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:35 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:09:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:35.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:36 compute-2 ceph-mon[74913]: pgmap v701: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:09:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:36.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:36 compute-2 sudo[238422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:09:36 compute-2 sudo[238422]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:09:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:36 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:36 compute-2 sudo[238422]: pam_unix(sudo:session): session closed for user root
Oct 10 10:09:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:36 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:37 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:37.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:09:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:09:38 compute-2 ceph-mon[74913]: pgmap v702: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:09:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:09:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:38.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:09:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:09:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:39.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:09:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:40 compute-2 ceph-mon[74913]: pgmap v703: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 10:09:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:40.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:40 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:40 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:41 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:41 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:09:41 compute-2 ceph-mon[74913]: pgmap v704: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Oct 10 10:09:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:09:41.461 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:09:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:09:41.462 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:09:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:09:41.462 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:09:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:41.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:09:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:42.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:09:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:42 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:42 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:43 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:43.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:43 compute-2 ceph-mon[74913]: pgmap v705: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.186656) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984186690, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1302, "num_deletes": 250, "total_data_size": 3196741, "memory_usage": 3260136, "flush_reason": "Manual Compaction"}
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984197237, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 1331552, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23215, "largest_seqno": 24512, "table_properties": {"data_size": 1327081, "index_size": 1995, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11513, "raw_average_key_size": 20, "raw_value_size": 1317447, "raw_average_value_size": 2340, "num_data_blocks": 86, "num_entries": 563, "num_filter_entries": 563, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090877, "oldest_key_time": 1760090877, "file_creation_time": 1760090984, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 10610 microseconds, and 3768 cpu microseconds.
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.197267) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 1331552 bytes OK
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.197281) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.200350) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.200366) EVENT_LOG_v1 {"time_micros": 1760090984200360, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.200381) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 3190593, prev total WAL file size 3190593, number of live WAL files 2.
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.201282) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353033' seq:72057594037927935, type:22 .. '6D67727374617400373534' seq:0, type:0; will stop at (end)
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(1300KB)], [42(14MB)]
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984201348, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 16052873, "oldest_snapshot_seqno": -1}
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5512 keys, 12707545 bytes, temperature: kUnknown
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984276144, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 12707545, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12671786, "index_size": 20865, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13829, "raw_key_size": 138905, "raw_average_key_size": 25, "raw_value_size": 12573186, "raw_average_value_size": 2281, "num_data_blocks": 855, "num_entries": 5512, "num_filter_entries": 5512, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090984, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.276371) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 12707545 bytes
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.278138) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 214.4 rd, 169.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 14.0 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(21.6) write-amplify(9.5) OK, records in: 5983, records dropped: 471 output_compression: NoCompression
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.278155) EVENT_LOG_v1 {"time_micros": 1760090984278147, "job": 24, "event": "compaction_finished", "compaction_time_micros": 74858, "compaction_time_cpu_micros": 34977, "output_level": 6, "num_output_files": 1, "total_output_size": 12707545, "num_input_records": 5983, "num_output_records": 5512, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984278481, "job": 24, "event": "table_file_deletion", "file_number": 44}
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984281292, "job": 24, "event": "table_file_deletion", "file_number": 42}
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.201193) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.281326) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.281331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.281333) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.281335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:09:44 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.281337) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:09:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:09:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:44.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:09:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:44 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:44 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 10:09:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:45.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 10:09:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:45 compute-2 nova_compute[235775]: 2025-10-10 10:09:45.817 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:09:45 compute-2 nova_compute[235775]: 2025-10-10 10:09:45.817 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:09:45 compute-2 nova_compute[235775]: 2025-10-10 10:09:45.818 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:09:45 compute-2 nova_compute[235775]: 2025-10-10 10:09:45.837 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:09:46 compute-2 ceph-mon[74913]: pgmap v706: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:09:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:46.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=404 latency=0.001000032s ======
Oct 10 10:09:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:46.578 +0000] "GET /healthcheck HTTP/1.1" 404 242 - "python-urllib3/1.26.5" - latency=0.001000032s
Oct 10 10:09:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:46 compute-2 nova_compute[235775]: 2025-10-10 10:09:46.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:09:46 compute-2 nova_compute[235775]: 2025-10-10 10:09:46.834 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:09:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100946 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:09:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:47 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:09:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:09:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:47.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:09:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:47 compute-2 podman[238458]: 2025-10-10 10:09:47.783503717 +0000 UTC m=+0.055543134 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Oct 10 10:09:47 compute-2 podman[238460]: 2025-10-10 10:09:47.793894849 +0000 UTC m=+0.056537606 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 10 10:09:47 compute-2 nova_compute[235775]: 2025-10-10 10:09:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:09:47 compute-2 nova_compute[235775]: 2025-10-10 10:09:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:09:47 compute-2 nova_compute[235775]: 2025-10-10 10:09:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:09:47 compute-2 podman[238459]: 2025-10-10 10:09:47.86689843 +0000 UTC m=+0.125116566 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller)
Oct 10 10:09:48 compute-2 ceph-mon[74913]: pgmap v707: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:09:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/177719747' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:09:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:48.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:48 compute-2 nova_compute[235775]: 2025-10-10 10:09:48.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:09:48 compute-2 nova_compute[235775]: 2025-10-10 10:09:48.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:09:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:49 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2615600125' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:09:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2916280922' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:09:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:09:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:49.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:09:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:49 compute-2 nova_compute[235775]: 2025-10-10 10:09:49.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:09:49 compute-2 nova_compute[235775]: 2025-10-10 10:09:49.844 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:09:49 compute-2 nova_compute[235775]: 2025-10-10 10:09:49.845 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:09:49 compute-2 nova_compute[235775]: 2025-10-10 10:09:49.845 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:09:49 compute-2 nova_compute[235775]: 2025-10-10 10:09:49.845 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:09:49 compute-2 nova_compute[235775]: 2025-10-10 10:09:49.845 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:09:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:09:50 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2285751282' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:09:50 compute-2 ceph-mon[74913]: pgmap v708: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:09:50 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/4222935933' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.264 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:09:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:50.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.427 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.428 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5238MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.429 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.429 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.491 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.491 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.513 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:09:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:09:50 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2966347030' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.955 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.963 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:09:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.992 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.995 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:09:50 compute-2 nova_compute[235775]: 2025-10-10 10:09:50.996 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:09:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:51 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:51 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2285751282' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:09:51 compute-2 ceph-mon[74913]: pgmap v709: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 426 B/s wr, 2 op/s
Oct 10 10:09:51 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2966347030' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:09:51 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e144 e144: 3 total, 3 up, 3 in
Oct 10 10:09:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:51.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:52 compute-2 ceph-mon[74913]: osdmap e144: 3 total, 3 up, 3 in
Oct 10 10:09:52 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e145 e145: 3 total, 3 up, 3 in
Oct 10 10:09:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:52.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84001080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:52 compute-2 nova_compute[235775]: 2025-10-10 10:09:52.996 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:09:52 compute-2 nova_compute[235775]: 2025-10-10 10:09:52.996 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:09:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:53 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e146 e146: 3 total, 3 up, 3 in
Oct 10 10:09:53 compute-2 ceph-mon[74913]: osdmap e145: 3 total, 3 up, 3 in
Oct 10 10:09:53 compute-2 ceph-mon[74913]: pgmap v712: 353 pgs: 353 active+clean; 8.4 MiB data, 165 MiB used, 60 GiB / 60 GiB avail; 6.6 KiB/s rd, 1.0 MiB/s wr, 10 op/s
Oct 10 10:09:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:09:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:53.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:09:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e147 e147: 3 total, 3 up, 3 in
Oct 10 10:09:54 compute-2 ceph-mon[74913]: osdmap e146: 3 total, 3 up, 3 in
Oct 10 10:09:54 compute-2 ceph-mon[74913]: osdmap e147: 3 total, 3 up, 3 in
Oct 10 10:09:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:09:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:54.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:09:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84001080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:55 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:09:55 compute-2 ceph-mon[74913]: pgmap v715: 353 pgs: 353 active+clean; 8.4 MiB data, 165 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 20 op/s
Oct 10 10:09:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 e148: 3 total, 3 up, 3 in
Oct 10 10:09:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:09:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:55.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:09:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:56.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:56 compute-2 ceph-mon[74913]: osdmap e148: 3 total, 3 up, 3 in
Oct 10 10:09:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100956 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:09:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:56 compute-2 sudo[238574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:09:56 compute-2 sudo[238574]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:09:56 compute-2 sudo[238574]: pam_unix(sudo:session): session closed for user root
Oct 10 10:09:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:57 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002900 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:57 compute-2 ceph-mon[74913]: pgmap v717: 353 pgs: 353 active+clean; 8.4 MiB data, 165 MiB used, 60 GiB / 60 GiB avail; 11 KiB/s rd, 1.8 MiB/s wr, 17 op/s
Oct 10 10:09:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:09:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:57.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:09:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:09:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:58.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:09:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:59 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:09:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:09:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:09:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:59.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:09:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:09:59 compute-2 ceph-mon[74913]: pgmap v718: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 35 KiB/s rd, 5.5 MiB/s wr, 50 op/s
Oct 10 10:10:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:00.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:00 compute-2 podman[238602]: 2025-10-10 10:10:00.797789409 +0000 UTC m=+0.066190333 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 10:10:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002900 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:00 compute-2 ceph-mon[74913]: overall HEALTH_OK
Oct 10 10:10:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:01 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:01.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:01 compute-2 ceph-mon[74913]: pgmap v719: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 4.4 MiB/s wr, 40 op/s
Oct 10 10:10:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:10:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:02.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84003610 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:03 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84003610 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:03.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:03 compute-2 ceph-mon[74913]: pgmap v720: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 3.8 MiB/s wr, 36 op/s
Oct 10 10:10:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:04.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:05 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:05 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:10:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:05.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:05 compute-2 ceph-mon[74913]: pgmap v721: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 3.3 MiB/s wr, 31 op/s
Oct 10 10:10:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:06.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84003610 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:06 compute-2 sudo[238630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:10:06 compute-2 sudo[238630]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:10:07 compute-2 sudo[238630]: pam_unix(sudo:session): session closed for user root
Oct 10 10:10:07 compute-2 sudo[238655]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:10:07 compute-2 sudo[238655]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:10:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:07 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:07.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:07 compute-2 sudo[238655]: pam_unix(sudo:session): session closed for user root
Oct 10 10:10:07 compute-2 ceph-mon[74913]: pgmap v722: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.9 MiB/s wr, 27 op/s
Oct 10 10:10:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:10:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:10:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:10:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:08.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:10:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:09 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:09.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:09 compute-2 ceph-mon[74913]: pgmap v723: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.7 MiB/s wr, 27 op/s
Oct 10 10:10:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:10.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:11 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:11 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:10:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:10:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:10:11 compute-2 ceph-mon[74913]: pgmap v724: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Oct 10 10:10:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:10:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:10:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:10:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:10:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:10:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:10:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:10:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:11.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:10:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:12.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:10:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:13 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:10:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:13.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:10:13 compute-2 ceph-mon[74913]: pgmap v725: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Oct 10 10:10:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:14.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:14 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:10:14.636 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:10:14 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:10:14.637 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:10:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c004790 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:15 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:15.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:15 compute-2 ceph-mon[74913]: pgmap v726: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:10:16 compute-2 sudo[238720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:10:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:16.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:16 compute-2 sudo[238720]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:10:16 compute-2 sudo[238720]: pam_unix(sudo:session): session closed for user root
Oct 10 10:10:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101016 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:10:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:17 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0047b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:17 compute-2 sudo[238747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:10:17 compute-2 sudo[238747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:10:17 compute-2 sudo[238747]: pam_unix(sudo:session): session closed for user root
Oct 10 10:10:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:10:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:10:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:10:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:17.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:18 compute-2 ceph-mon[74913]: pgmap v727: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:10:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:18.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:18 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:10:18.640 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:10:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:18 compute-2 podman[238773]: 2025-10-10 10:10:18.824127752 +0000 UTC m=+0.091478243 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 10 10:10:18 compute-2 podman[238774]: 2025-10-10 10:10:18.828031536 +0000 UTC m=+0.089663514 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 10:10:18 compute-2 podman[238780]: 2025-10-10 10:10:18.84570453 +0000 UTC m=+0.099287931 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 10 10:10:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:19 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.204600) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019204635, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 713, "num_deletes": 257, "total_data_size": 1346583, "memory_usage": 1367056, "flush_reason": "Manual Compaction"}
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019214039, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 870498, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24517, "largest_seqno": 25225, "table_properties": {"data_size": 867026, "index_size": 1316, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7925, "raw_average_key_size": 18, "raw_value_size": 859782, "raw_average_value_size": 2008, "num_data_blocks": 58, "num_entries": 428, "num_filter_entries": 428, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090985, "oldest_key_time": 1760090985, "file_creation_time": 1760091019, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 9496 microseconds, and 5328 cpu microseconds.
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.214091) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 870498 bytes OK
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.214115) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.215920) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.215941) EVENT_LOG_v1 {"time_micros": 1760091019215934, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.215962) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 1342679, prev total WAL file size 1342679, number of live WAL files 2.
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.216857) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323530' seq:72057594037927935, type:22 .. '6C6F676D00353033' seq:0, type:0; will stop at (end)
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(850KB)], [45(12MB)]
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019216906, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13578043, "oldest_snapshot_seqno": -1}
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5409 keys, 13422995 bytes, temperature: kUnknown
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019299560, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13422995, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13386831, "index_size": 21526, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 137997, "raw_average_key_size": 25, "raw_value_size": 13288926, "raw_average_value_size": 2456, "num_data_blocks": 879, "num_entries": 5409, "num_filter_entries": 5409, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091019, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.299867) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13422995 bytes
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.301079) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.1 rd, 162.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 12.1 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(31.0) write-amplify(15.4) OK, records in: 5940, records dropped: 531 output_compression: NoCompression
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.301099) EVENT_LOG_v1 {"time_micros": 1760091019301090, "job": 26, "event": "compaction_finished", "compaction_time_micros": 82728, "compaction_time_cpu_micros": 45431, "output_level": 6, "num_output_files": 1, "total_output_size": 13422995, "num_input_records": 5940, "num_output_records": 5409, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019301362, "job": 26, "event": "table_file_deletion", "file_number": 47}
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019304190, "job": 26, "event": "table_file_deletion", "file_number": 45}
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.216770) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.304232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.304239) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.304242) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.304245) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:10:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.304248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:10:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:19.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:20 compute-2 ceph-mon[74913]: pgmap v728: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Oct 10 10:10:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:20.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0047d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:21 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:10:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:21.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:10:22 compute-2 ceph-mon[74913]: pgmap v729: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:10:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:22.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0047f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:23 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:10:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:23.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:10:24 compute-2 ceph-mon[74913]: pgmap v730: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:10:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:24.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d58000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:25 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0047f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:25 compute-2 ceph-mon[74913]: pgmap v731: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:10:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:25.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:10:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3994215839' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:10:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:10:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3994215839' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:10:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:26.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3994215839' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:10:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3994215839' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:10:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:27 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:27 compute-2 ceph-mon[74913]: pgmap v732: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:10:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:27.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:28.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:29 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:10:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:29.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:10:29 compute-2 ceph-mon[74913]: pgmap v733: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:10:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:10:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:30.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:10:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:31 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:31.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:31 compute-2 podman[238855]: 2025-10-10 10:10:31.79702922 +0000 UTC m=+0.065949087 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 10 10:10:31 compute-2 ceph-mon[74913]: pgmap v734: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:10:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:10:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:32.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy ignored for local
Oct 10 10:10:32 compute-2 kernel: ganesha.nfsd[238389]: segfault at 50 ip 00007f9e3fab232e sp 00007f9e0cff8210 error 4 in libntirpc.so.5.8[7f9e3fa97000+2c000] likely on CPU 7 (core 0, socket 7)
Oct 10 10:10:32 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 10:10:32 compute-2 systemd[1]: Started Process Core Dump (PID 238879/UID 0).
Oct 10 10:10:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:33.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:33 compute-2 ceph-mon[74913]: pgmap v735: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Oct 10 10:10:33 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3328175243' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:10:34 compute-2 systemd-coredump[238880]: Process 236928 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 60:
                                                    #0  0x00007f9e3fab232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Oct 10 10:10:34 compute-2 systemd[1]: systemd-coredump@8-238879-0.service: Deactivated successfully.
Oct 10 10:10:34 compute-2 systemd[1]: systemd-coredump@8-238879-0.service: Consumed 1.172s CPU time.
Oct 10 10:10:34 compute-2 podman[238885]: 2025-10-10 10:10:34.214515272 +0000 UTC m=+0.021911511 container died 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 10:10:34 compute-2 systemd[1]: var-lib-containers-storage-overlay-d83eab439fa5a3e9abff5a44dbe3ed5529a7fd8a5d250f9e424122df311085d4-merged.mount: Deactivated successfully.
Oct 10 10:10:34 compute-2 podman[238885]: 2025-10-10 10:10:34.253969602 +0000 UTC m=+0.061365821 container remove 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Oct 10 10:10:34 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 10:10:34 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 10:10:34 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.754s CPU time.
Oct 10 10:10:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:10:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:34.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:10:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:10:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:35.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:10:35 compute-2 ceph-mon[74913]: pgmap v736: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:10:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e149 e149: 3 total, 3 up, 3 in
Oct 10 10:10:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:10:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:36.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:10:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:36 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e150 e150: 3 total, 3 up, 3 in
Oct 10 10:10:36 compute-2 ceph-mon[74913]: osdmap e149: 3 total, 3 up, 3 in
Oct 10 10:10:37 compute-2 sudo[238933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:10:37 compute-2 sudo[238933]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:10:37 compute-2 sudo[238933]: pam_unix(sudo:session): session closed for user root
Oct 10 10:10:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:37.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:38 compute-2 ceph-mon[74913]: pgmap v738: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 0 op/s
Oct 10 10:10:38 compute-2 ceph-mon[74913]: osdmap e150: 3 total, 3 up, 3 in
Oct 10 10:10:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:10:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:38.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:10:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101038 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:10:39 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 e151: 3 total, 3 up, 3 in
Oct 10 10:10:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:39.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:40 compute-2 ceph-mon[74913]: pgmap v740: 353 pgs: 353 active+clean; 88 MiB data, 208 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 42 op/s
Oct 10 10:10:40 compute-2 ceph-mon[74913]: osdmap e151: 3 total, 3 up, 3 in
Oct 10 10:10:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:40.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:41 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/809446276' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:10:41 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1280200693' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:10:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:10:41.463 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:10:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:10:41.463 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:10:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:10:41.463 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:10:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:41.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:42 compute-2 ceph-mon[74913]: pgmap v742: 353 pgs: 353 active+clean; 88 MiB data, 208 MiB used, 60 GiB / 60 GiB avail; 3.4 MiB/s rd, 3.5 MiB/s wr, 56 op/s
Oct 10 10:10:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:42.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:43 compute-2 ceph-mon[74913]: pgmap v743: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 3.0 MiB/s rd, 3.1 MiB/s wr, 59 op/s
Oct 10 10:10:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:10:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:43.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:10:44 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 9.
Oct 10 10:10:44 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:10:44 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.754s CPU time.
Oct 10 10:10:44 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 10:10:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:10:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:44.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:10:44 compute-2 podman[239015]: 2025-10-10 10:10:44.709954275 +0000 UTC m=+0.042750726 container create 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 10:10:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88875b0b4a98559c29846cc1430ee2bc98721cecb6b6468b744e1d314bff0520/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 10:10:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88875b0b4a98559c29846cc1430ee2bc98721cecb6b6468b744e1d314bff0520/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 10:10:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88875b0b4a98559c29846cc1430ee2bc98721cecb6b6468b744e1d314bff0520/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:10:44 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88875b0b4a98559c29846cc1430ee2bc98721cecb6b6468b744e1d314bff0520/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:10:44 compute-2 podman[239015]: 2025-10-10 10:10:44.771442138 +0000 UTC m=+0.104238599 container init 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Oct 10 10:10:44 compute-2 podman[239015]: 2025-10-10 10:10:44.782658696 +0000 UTC m=+0.115455117 container start 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 10:10:44 compute-2 podman[239015]: 2025-10-10 10:10:44.689598505 +0000 UTC m=+0.022394946 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 10:10:44 compute-2 bash[239015]: 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31
Oct 10 10:10:44 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:10:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 10:10:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 10:10:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 10:10:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 10:10:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 10:10:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 10:10:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 10:10:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:10:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 10:10:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:45.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 10:10:45 compute-2 nova_compute[235775]: 2025-10-10 10:10:45.816 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:45 compute-2 nova_compute[235775]: 2025-10-10 10:10:45.816 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:10:45 compute-2 nova_compute[235775]: 2025-10-10 10:10:45.816 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:10:45 compute-2 nova_compute[235775]: 2025-10-10 10:10:45.842 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:10:45 compute-2 nova_compute[235775]: 2025-10-10 10:10:45.842 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:45 compute-2 nova_compute[235775]: 2025-10-10 10:10:45.842 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 10 10:10:45 compute-2 nova_compute[235775]: 2025-10-10 10:10:45.863 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 10 10:10:45 compute-2 nova_compute[235775]: 2025-10-10 10:10:45.864 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:45 compute-2 nova_compute[235775]: 2025-10-10 10:10:45.864 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 10 10:10:45 compute-2 nova_compute[235775]: 2025-10-10 10:10:45.877 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:45 compute-2 ceph-mon[74913]: pgmap v744: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 50 op/s
Oct 10 10:10:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:46.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:10:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:47.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:47 compute-2 nova_compute[235775]: 2025-10-10 10:10:47.861 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:47 compute-2 nova_compute[235775]: 2025-10-10 10:10:47.861 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:47 compute-2 nova_compute[235775]: 2025-10-10 10:10:47.861 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:47 compute-2 ceph-mon[74913]: pgmap v745: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 2.2 MiB/s wr, 41 op/s
Oct 10 10:10:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:48.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3226504384' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:10:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:49.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:49 compute-2 podman[239080]: 2025-10-10 10:10:49.786853122 +0000 UTC m=+0.056515625 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:10:49 compute-2 podman[239078]: 2025-10-10 10:10:49.803913227 +0000 UTC m=+0.083464047 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct 10 10:10:49 compute-2 nova_compute[235775]: 2025-10-10 10:10:49.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:49 compute-2 nova_compute[235775]: 2025-10-10 10:10:49.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:49 compute-2 nova_compute[235775]: 2025-10-10 10:10:49.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:10:49 compute-2 podman[239079]: 2025-10-10 10:10:49.834768592 +0000 UTC m=+0.109954482 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 10:10:49 compute-2 ceph-mon[74913]: pgmap v746: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 1.6 MiB/s rd, 15 KiB/s wr, 74 op/s
Oct 10 10:10:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2896383186' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:10:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2553159324' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:10:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:50.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:50 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:10:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:50 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:10:50 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3253010054' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:10:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:51.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:51 compute-2 nova_compute[235775]: 2025-10-10 10:10:51.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:51 compute-2 nova_compute[235775]: 2025-10-10 10:10:51.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:51 compute-2 nova_compute[235775]: 2025-10-10 10:10:51.934 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:10:51 compute-2 nova_compute[235775]: 2025-10-10 10:10:51.935 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:10:51 compute-2 nova_compute[235775]: 2025-10-10 10:10:51.935 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:10:51 compute-2 nova_compute[235775]: 2025-10-10 10:10:51.936 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:10:51 compute-2 nova_compute[235775]: 2025-10-10 10:10:51.936 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:10:52 compute-2 ceph-mon[74913]: pgmap v747: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 1.4 MiB/s rd, 13 KiB/s wr, 64 op/s
Oct 10 10:10:52 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:10:52 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1333009736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.406 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:10:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:10:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:52.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.546 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.548 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5221MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.548 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.548 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.661 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.661 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:10:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.745 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing inventories for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.769 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating ProviderTree inventory for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.769 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.867 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing aggregate associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.893 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing trait associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, traits: HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_SVM,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 10 10:10:52 compute-2 nova_compute[235775]: 2025-10-10 10:10:52.919 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:10:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1333009736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:10:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:10:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/547334843' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:10:53 compute-2 nova_compute[235775]: 2025-10-10 10:10:53.338 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:10:53 compute-2 nova_compute[235775]: 2025-10-10 10:10:53.342 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:10:53 compute-2 nova_compute[235775]: 2025-10-10 10:10:53.360 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:10:53 compute-2 nova_compute[235775]: 2025-10-10 10:10:53.361 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:10:53 compute-2 nova_compute[235775]: 2025-10-10 10:10:53.361 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.813s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:10:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:53.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:54 compute-2 ceph-mon[74913]: pgmap v748: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 82 op/s
Oct 10 10:10:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/547334843' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:10:54 compute-2 nova_compute[235775]: 2025-10-10 10:10:54.361 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:10:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:54.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:10:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:55.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:56 compute-2 ceph-mon[74913]: pgmap v749: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 76 op/s
Oct 10 10:10:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:56.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 10:10:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca40016c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:57 compute-2 sudo[239210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:10:57 compute-2 sudo[239210]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:10:57 compute-2 sudo[239210]: pam_unix(sudo:session): session closed for user root
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:57.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:10:57 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Oct 10 10:10:58 compute-2 ceph-mon[74913]: pgmap v750: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 76 op/s
Oct 10 10:10:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:10:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:58.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:10:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:58 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101059 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:10:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:59 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:59 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:10:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:10:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:10:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:10:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:59.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:00 compute-2 ceph-mon[74913]: pgmap v751: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 79 op/s
Oct 10 10:11:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:00.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:00 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:01 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:01 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:01.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:02 compute-2 ceph-mon[74913]: pgmap v752: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 584 KiB/s rd, 938 B/s wr, 23 op/s
Oct 10 10:11:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:11:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:02.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:02 compute-2 podman[239240]: 2025-10-10 10:11:02.809190701 +0000 UTC m=+0.074052165 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:11:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:02 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:03 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:03 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:03.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:04 compute-2 ceph-mon[74913]: pgmap v753: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 903 KiB/s rd, 2.1 MiB/s wr, 85 op/s
Oct 10 10:11:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:04.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:04 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:05 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:05 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:11:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:05.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:11:06 compute-2 ceph-mon[74913]: pgmap v754: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:11:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:06.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:06 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:07 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:07 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:11:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:07.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:11:08 compute-2 ceph-mon[74913]: pgmap v755: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:11:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:11:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:08.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:11:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:08 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:09 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:09 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:09.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:10 compute-2 ceph-mon[74913]: pgmap v756: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 333 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Oct 10 10:11:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:10.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:10 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:11 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:11 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:11.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:12 compute-2 ceph-mon[74913]: pgmap v757: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 324 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Oct 10 10:11:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:12.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:12 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:13 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:13 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:11:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:13.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:11:14 compute-2 ceph-mon[74913]: pgmap v758: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Oct 10 10:11:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:14.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:14 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:15 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:15 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:15.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:16 compute-2 ceph-mon[74913]: pgmap v759: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 6.2 KiB/s rd, 16 KiB/s wr, 1 op/s
Oct 10 10:11:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:16.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:16 compute-2 sudo[239273]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:11:16 compute-2 sudo[239273]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:11:16 compute-2 sudo[239273]: pam_unix(sudo:session): session closed for user root
Oct 10 10:11:16 compute-2 sudo[239299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:11:16 compute-2 sudo[239299]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:11:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:16 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:17 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:17 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:11:17 compute-2 sudo[239344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:11:17 compute-2 sudo[239344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:11:17 compute-2 sudo[239344]: pam_unix(sudo:session): session closed for user root
Oct 10 10:11:17 compute-2 sudo[239299]: pam_unix(sudo:session): session closed for user root
Oct 10 10:11:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:17.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:18 compute-2 ceph-mon[74913]: pgmap v760: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 6.2 KiB/s rd, 16 KiB/s wr, 1 op/s
Oct 10 10:11:18 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:11:18 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:11:18 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:11:18 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:11:18 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:11:18 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:11:18 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:11:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:18.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:18 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:19 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:19 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3887558386' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.255391) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079255431, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 883, "num_deletes": 251, "total_data_size": 1687246, "memory_usage": 1713376, "flush_reason": "Manual Compaction"}
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079263121, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1112996, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25230, "largest_seqno": 26108, "table_properties": {"data_size": 1108996, "index_size": 1716, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9395, "raw_average_key_size": 19, "raw_value_size": 1100701, "raw_average_value_size": 2307, "num_data_blocks": 77, "num_entries": 477, "num_filter_entries": 477, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091020, "oldest_key_time": 1760091020, "file_creation_time": 1760091079, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 7799 microseconds, and 3684 cpu microseconds.
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.263183) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1112996 bytes OK
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.263209) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.264754) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.264808) EVENT_LOG_v1 {"time_micros": 1760091079264797, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.264889) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 1682734, prev total WAL file size 1682734, number of live WAL files 2.
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.265874) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1086KB)], [48(12MB)]
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079265950, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 14535991, "oldest_snapshot_seqno": -1}
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5368 keys, 12453248 bytes, temperature: kUnknown
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079337523, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 12453248, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12418205, "index_size": 20533, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13445, "raw_key_size": 137880, "raw_average_key_size": 25, "raw_value_size": 12321652, "raw_average_value_size": 2295, "num_data_blocks": 834, "num_entries": 5368, "num_filter_entries": 5368, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091079, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.337779) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 12453248 bytes
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.339672) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 202.9 rd, 173.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 12.8 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(24.2) write-amplify(11.2) OK, records in: 5886, records dropped: 518 output_compression: NoCompression
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.339694) EVENT_LOG_v1 {"time_micros": 1760091079339685, "job": 28, "event": "compaction_finished", "compaction_time_micros": 71655, "compaction_time_cpu_micros": 25533, "output_level": 6, "num_output_files": 1, "total_output_size": 12453248, "num_input_records": 5886, "num_output_records": 5368, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079340090, "job": 28, "event": "table_file_deletion", "file_number": 50}
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079342653, "job": 28, "event": "table_file_deletion", "file_number": 48}
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.265731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.342799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.342807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.342810) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.342813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:11:19 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.342818) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:11:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:19.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:20 compute-2 ceph-mon[74913]: pgmap v761: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 6.5 KiB/s rd, 17 KiB/s wr, 1 op/s
Oct 10 10:11:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:20.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:20 compute-2 podman[239383]: 2025-10-10 10:11:20.809181933 +0000 UTC m=+0.077000130 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:11:20 compute-2 podman[239385]: 2025-10-10 10:11:20.834972135 +0000 UTC m=+0.089661263 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid)
Oct 10 10:11:20 compute-2 podman[239384]: 2025-10-10 10:11:20.836944649 +0000 UTC m=+0.099331313 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:11:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:20 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:21 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:21 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:21 compute-2 ceph-mon[74913]: pgmap v762: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 4.3 KiB/s wr, 0 op/s
Oct 10 10:11:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:21.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:22.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:22 compute-2 sudo[239447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:11:22 compute-2 sudo[239447]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:11:22 compute-2 sudo[239447]: pam_unix(sudo:session): session closed for user root
Oct 10 10:11:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:22 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:11:22.750 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:11:22 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:11:22.751 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:11:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:22 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:23 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:23 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:23 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:11:23 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:11:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:23.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:24 compute-2 ceph-mon[74913]: pgmap v763: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Oct 10 10:11:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:24.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:24 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:25 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:25 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:25 compute-2 ceph-mon[74913]: pgmap v764: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Oct 10 10:11:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000064s ======
Oct 10 10:11:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:25.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Oct 10 10:11:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/46960711' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:11:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/46960711' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:11:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:26.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:26 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:11:26.753 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:11:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:26 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:27 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:27 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:27 compute-2 ceph-mon[74913]: pgmap v765: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Oct 10 10:11:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3305200522' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:11:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:27.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101128 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:11:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:28.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:28 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3006987527' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:11:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:28 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:29 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:29 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:29 compute-2 ceph-mon[74913]: pgmap v766: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Oct 10 10:11:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:29.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:30.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:30 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:31 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90003720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:31 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:31.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:31 compute-2 ceph-mon[74913]: pgmap v767: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Oct 10 10:11:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:11:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:32.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:32 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:33 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:33 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90003740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:33 compute-2 podman[239485]: 2025-10-10 10:11:33.77730261 +0000 UTC m=+0.055322437 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:11:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:33.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:33 compute-2 ceph-mon[74913]: pgmap v768: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 44 op/s
Oct 10 10:11:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:34.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:34 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:35 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:35 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:35.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:35 compute-2 ceph-mon[74913]: pgmap v769: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 7.4 KiB/s rd, 15 KiB/s wr, 10 op/s
Oct 10 10:11:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:36.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:36 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900037f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:37 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:37 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:11:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:37 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:37 compute-2 sudo[239509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:11:37 compute-2 sudo[239509]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:11:37 compute-2 sudo[239509]: pam_unix(sudo:session): session closed for user root
Oct 10 10:11:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:37.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:38 compute-2 ceph-mon[74913]: pgmap v770: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 7.4 KiB/s rd, 15 KiB/s wr, 10 op/s
Oct 10 10:11:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:38.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:38 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c0010d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:39 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90003810 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:39 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:39.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:40 compute-2 ceph-mon[74913]: pgmap v771: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 76 op/s
Oct 10 10:11:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:40 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:11:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:40 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:11:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:40.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:40 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:41 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:41 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c001250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:11:41.464 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:11:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:11:41.464 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:11:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:11:41.464 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:11:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:41.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:42 compute-2 ceph-mon[74913]: pgmap v772: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 76 op/s
Oct 10 10:11:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:42.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:42 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:43 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:43 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:11:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:43 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:43.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:44 compute-2 ceph-mon[74913]: pgmap v773: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 17 KiB/s wr, 77 op/s
Oct 10 10:11:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:44.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c002410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:45 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c002410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:45 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:45 compute-2 nova_compute[235775]: 2025-10-10 10:11:45.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:11:45 compute-2 nova_compute[235775]: 2025-10-10 10:11:45.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:11:45 compute-2 nova_compute[235775]: 2025-10-10 10:11:45.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:11:45 compute-2 nova_compute[235775]: 2025-10-10 10:11:45.828 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:11:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:45.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:46 compute-2 ceph-mon[74913]: pgmap v774: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 2.0 KiB/s wr, 67 op/s
Oct 10 10:11:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:11:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:46.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:11:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:46 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c002410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:47 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:11:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:47 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c840016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:47 compute-2 nova_compute[235775]: 2025-10-10 10:11:47.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:11:47 compute-2 nova_compute[235775]: 2025-10-10 10:11:47.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:11:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:47.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:48 compute-2 ceph-mon[74913]: pgmap v775: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 2.0 KiB/s wr, 67 op/s
Oct 10 10:11:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:48.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:48 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:49 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:49 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c002410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:49 compute-2 nova_compute[235775]: 2025-10-10 10:11:49.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:11:49 compute-2 nova_compute[235775]: 2025-10-10 10:11:49.828 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:11:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:49.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101150 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:11:50 compute-2 ceph-mon[74913]: pgmap v776: 353 pgs: 353 active+clean; 200 MiB data, 345 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 130 op/s
Oct 10 10:11:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:50.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:50 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:51 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:51 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c0040b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:51 compute-2 podman[239550]: 2025-10-10 10:11:51.78905866 +0000 UTC m=+0.065388249 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251001)
Oct 10 10:11:51 compute-2 podman[239552]: 2025-10-10 10:11:51.795614439 +0000 UTC m=+0.064943024 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 10:11:51 compute-2 nova_compute[235775]: 2025-10-10 10:11:51.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:11:51 compute-2 nova_compute[235775]: 2025-10-10 10:11:51.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:11:51 compute-2 nova_compute[235775]: 2025-10-10 10:11:51.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:11:51 compute-2 nova_compute[235775]: 2025-10-10 10:11:51.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:11:51 compute-2 podman[239551]: 2025-10-10 10:11:51.818551401 +0000 UTC m=+0.091890564 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct 10 10:11:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:51.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:52 compute-2 ceph-mon[74913]: pgmap v777: 353 pgs: 353 active+clean; 200 MiB data, 345 MiB used, 60 GiB / 60 GiB avail; 322 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:11:52 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1869105366' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:11:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:52.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:52 compute-2 nova_compute[235775]: 2025-10-10 10:11:52.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:11:52 compute-2 nova_compute[235775]: 2025-10-10 10:11:52.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:11:52 compute-2 nova_compute[235775]: 2025-10-10 10:11:52.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:11:52 compute-2 nova_compute[235775]: 2025-10-10 10:11:52.844 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:11:52 compute-2 nova_compute[235775]: 2025-10-10 10:11:52.844 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:11:52 compute-2 nova_compute[235775]: 2025-10-10 10:11:52.844 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:11:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:52 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:53 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1716989008' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:11:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3689505701' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:11:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:53 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:11:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3217553215' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.261 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.413 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.414 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5186MB free_disk=59.89714813232422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.414 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.415 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.478 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.479 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.498 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:11:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:53.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:11:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3655824811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.905 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.912 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.956 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.957 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:11:53 compute-2 nova_compute[235775]: 2025-10-10 10:11:53.958 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:11:54 compute-2 ceph-mon[74913]: pgmap v778: 353 pgs: 353 active+clean; 200 MiB data, 345 MiB used, 60 GiB / 60 GiB avail; 322 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Oct 10 10:11:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3217553215' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:11:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/846796813' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:11:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3655824811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:11:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:54.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:54 compute-2 nova_compute[235775]: 2025-10-10 10:11:54.958 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:11:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:55 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c0040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:55 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3776578443' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:11:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:55 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:11:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:55.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:56 compute-2 ceph-mon[74913]: pgmap v779: 353 pgs: 353 active+clean; 200 MiB data, 345 MiB used, 60 GiB / 60 GiB avail; 322 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Oct 10 10:11:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:56.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c0040f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3981716478' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:11:57 compute-2 sudo[239662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:11:57 compute-2 sudo[239662]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:11:57 compute-2 sudo[239662]: pam_unix(sudo:session): session closed for user root
Oct 10 10:11:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:57.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:58 compute-2 ceph-mon[74913]: pgmap v780: 353 pgs: 353 active+clean; 200 MiB data, 345 MiB used, 60 GiB / 60 GiB avail; 322 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Oct 10 10:11:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:58.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:11:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:59 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:59 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:59 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:11:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:11:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:11:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:11:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:59.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:00 compute-2 ceph-mon[74913]: pgmap v781: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 341 KiB/s rd, 2.1 MiB/s wr, 91 op/s
Oct 10 10:12:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:12:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:00.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:12:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:01 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:01 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:01 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:01.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:02 compute-2 ceph-mon[74913]: pgmap v782: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Oct 10 10:12:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:12:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:02.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:03 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:03 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:03 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2472464452' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:12:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:03.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:04 compute-2 ceph-mon[74913]: pgmap v783: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 16 KiB/s wr, 56 op/s
Oct 10 10:12:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:04.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:04 compute-2 podman[239694]: 2025-10-10 10:12:04.788519057 +0000 UTC m=+0.052886120 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 10:12:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:05 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:05 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004130 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:05 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:05 compute-2 ceph-mon[74913]: pgmap v784: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 3.3 KiB/s wr, 55 op/s
Oct 10 10:12:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:05.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:12:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:06.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:12:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:07 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:07 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:07 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004150 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:07.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:07 compute-2 ceph-mon[74913]: pgmap v785: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 3.3 KiB/s wr, 55 op/s
Oct 10 10:12:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:12:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:08.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:12:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:09 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:09 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:09 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:09.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:09 compute-2 ceph-mon[74913]: pgmap v786: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 3.3 KiB/s wr, 56 op/s
Oct 10 10:12:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:10.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:11 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:11 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:11 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:11.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:11 compute-2 ceph-mon[74913]: pgmap v787: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Oct 10 10:12:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:12:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:12.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:12:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:13 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:13 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:13 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:13.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:13 compute-2 ceph-mon[74913]: pgmap v788: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Oct 10 10:12:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:14.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:15 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:15 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:15 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:15.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:16 compute-2 ceph-mon[74913]: pgmap v789: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:12:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:16.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:12:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:17 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:17 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:17 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:17 compute-2 sudo[239730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:12:17 compute-2 sudo[239730]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:12:17 compute-2 sudo[239730]: pam_unix(sudo:session): session closed for user root
Oct 10 10:12:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:17.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:18 compute-2 ceph-mon[74913]: pgmap v790: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:12:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:18.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:19 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:19 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:19 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:12:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:19.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:12:20 compute-2 ceph-mon[74913]: pgmap v791: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Oct 10 10:12:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:20.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:21 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:21 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:21 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:21.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:22 compute-2 ceph-mon[74913]: pgmap v792: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:12:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:22.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:22 compute-2 sudo[239760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:12:22 compute-2 sudo[239760]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:12:22 compute-2 sudo[239760]: pam_unix(sudo:session): session closed for user root
Oct 10 10:12:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:22 compute-2 sudo[239803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:12:22 compute-2 sudo[239803]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:12:22 compute-2 podman[239784]: 2025-10-10 10:12:22.798953971 +0000 UTC m=+0.069044175 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:12:22 compute-2 podman[239786]: 2025-10-10 10:12:22.814811327 +0000 UTC m=+0.075966685 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:12:22 compute-2 podman[239785]: 2025-10-10 10:12:22.827699579 +0000 UTC m=+0.090431498 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 10 10:12:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:23 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:23 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:23 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900029d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:23 compute-2 sudo[239803]: pam_unix(sudo:session): session closed for user root
Oct 10 10:12:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:23.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:24 compute-2 ceph-mon[74913]: pgmap v793: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:12:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:12:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:12:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:12:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:12:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:12:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:12:24 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:12:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:12:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:24.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:12:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:25 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:25 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:25 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:12:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:25.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:12:26 compute-2 ceph-mon[74913]: pgmap v794: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:12:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/4121639621' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:12:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:26.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:27 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900029d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:27 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/44782465' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:12:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/44782465' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:12:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:27 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:27.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:28 compute-2 ceph-mon[74913]: pgmap v795: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Oct 10 10:12:28 compute-2 sudo[239909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:12:28 compute-2 sudo[239909]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:12:28 compute-2 sudo[239909]: pam_unix(sudo:session): session closed for user root
Oct 10 10:12:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:28.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:29 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:29 compute-2 kernel: ganesha.nfsd[239724]: segfault at 50 ip 00007f6d606f432e sp 00007f6d24ff8210 error 4 in libntirpc.so.5.8[7f6d606d9000+2c000] likely on CPU 5 (core 0, socket 5)
Oct 10 10:12:29 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 10:12:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:29 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40089d0 fd 38 proxy ignored for local
Oct 10 10:12:29 compute-2 systemd[1]: Started Process Core Dump (PID 239936/UID 0).
Oct 10 10:12:29 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:12:29 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:12:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:29.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:30 compute-2 systemd-coredump[239937]: Process 239035 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 59:
                                                    #0  0x00007f6d606f432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Oct 10 10:12:30 compute-2 ceph-mon[74913]: pgmap v796: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Oct 10 10:12:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:30 compute-2 systemd[1]: systemd-coredump@9-239936-0.service: Deactivated successfully.
Oct 10 10:12:30 compute-2 systemd[1]: systemd-coredump@9-239936-0.service: Consumed 1.188s CPU time.
Oct 10 10:12:30 compute-2 podman[239942]: 2025-10-10 10:12:30.373640986 +0000 UTC m=+0.026125995 container died 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True)
Oct 10 10:12:30 compute-2 systemd[1]: var-lib-containers-storage-overlay-88875b0b4a98559c29846cc1430ee2bc98721cecb6b6468b744e1d314bff0520-merged.mount: Deactivated successfully.
Oct 10 10:12:30 compute-2 podman[239942]: 2025-10-10 10:12:30.41790686 +0000 UTC m=+0.070391899 container remove 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Oct 10 10:12:30 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 10:12:30 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 10:12:30 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.658s CPU time.
Oct 10 10:12:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:12:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:30.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:12:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:31 compute-2 ceph-mon[74913]: pgmap v797: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Oct 10 10:12:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:31.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:12:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:12:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:32.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:12:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:33 compute-2 ceph-mon[74913]: pgmap v798: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Oct 10 10:12:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:12:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:33.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:12:34 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3163061778' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:12:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:12:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:34.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:12:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101235 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:12:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:35 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/543711216' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:12:35 compute-2 ceph-mon[74913]: pgmap v799: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Oct 10 10:12:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:35 compute-2 podman[239993]: 2025-10-10 10:12:35.776707817 +0000 UTC m=+0.049403788 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 10 10:12:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:12:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:35.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:12:36 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:12:36.013 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:12:36 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:12:36.014 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:12:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:12:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:36.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:12:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:37.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:37 compute-2 sudo[240016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:12:37 compute-2 sudo[240016]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:12:37 compute-2 sudo[240016]: pam_unix(sudo:session): session closed for user root
Oct 10 10:12:37 compute-2 ceph-mon[74913]: pgmap v800: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Oct 10 10:12:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:38.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:39 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:12:39.016 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:12:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:39.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:39 compute-2 ceph-mon[74913]: pgmap v801: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 710 KiB/s rd, 1.8 MiB/s wr, 60 op/s
Oct 10 10:12:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:40.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:40 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 10.
Oct 10 10:12:40 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:12:40 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.658s CPU time.
Oct 10 10:12:40 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 10:12:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:40 compute-2 podman[240092]: 2025-10-10 10:12:40.898085695 +0000 UTC m=+0.040313868 container create a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Oct 10 10:12:40 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3151ba89ddb65e49d6e0646af8407c3119e5a0ce05c4a2b5e5272229a03a14f/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 10:12:40 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3151ba89ddb65e49d6e0646af8407c3119e5a0ce05c4a2b5e5272229a03a14f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 10:12:40 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3151ba89ddb65e49d6e0646af8407c3119e5a0ce05c4a2b5e5272229a03a14f/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:12:40 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3151ba89ddb65e49d6e0646af8407c3119e5a0ce05c4a2b5e5272229a03a14f/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:12:40 compute-2 podman[240092]: 2025-10-10 10:12:40.950058765 +0000 UTC m=+0.092286958 container init a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 10:12:40 compute-2 podman[240092]: 2025-10-10 10:12:40.956399627 +0000 UTC m=+0.098627810 container start a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True)
Oct 10 10:12:40 compute-2 bash[240092]: a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24
Oct 10 10:12:40 compute-2 podman[240092]: 2025-10-10 10:12:40.879578545 +0000 UTC m=+0.021806748 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 10:12:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:40 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 10:12:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:40 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 10:12:40 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:12:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 10:12:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 10:12:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 10:12:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 10:12:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 10:12:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:12:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:12:41.465 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:12:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:12:41.465 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:12:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:12:41.465 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:12:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:12:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:41.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:12:41 compute-2 ceph-mon[74913]: pgmap v802: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 693 KiB/s rd, 12 KiB/s wr, 33 op/s
Oct 10 10:12:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:12:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:42.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:12:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:43.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:44 compute-2 ceph-mon[74913]: pgmap v803: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Oct 10 10:12:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:44.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:45.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:46 compute-2 ceph-mon[74913]: pgmap v804: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Oct 10 10:12:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:46.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:12:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:47 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:12:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:47 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:12:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:47 compute-2 nova_compute[235775]: 2025-10-10 10:12:47.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:12:47 compute-2 nova_compute[235775]: 2025-10-10 10:12:47.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:12:47 compute-2 nova_compute[235775]: 2025-10-10 10:12:47.813 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:12:47 compute-2 nova_compute[235775]: 2025-10-10 10:12:47.813 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:12:47 compute-2 nova_compute[235775]: 2025-10-10 10:12:47.836 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:12:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:47.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:48 compute-2 ceph-mon[74913]: pgmap v805: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 74 op/s
Oct 10 10:12:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:12:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:48.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:12:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:49 compute-2 nova_compute[235775]: 2025-10-10 10:12:49.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:12:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:49.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:50 compute-2 ceph-mon[74913]: pgmap v806: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 77 op/s
Oct 10 10:12:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:50.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:50 compute-2 nova_compute[235775]: 2025-10-10 10:12:50.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:12:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:51 compute-2 nova_compute[235775]: 2025-10-10 10:12:51.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:12:51 compute-2 nova_compute[235775]: 2025-10-10 10:12:51.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:12:51 compute-2 nova_compute[235775]: 2025-10-10 10:12:51.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:12:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:51.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:52 compute-2 ceph-mon[74913]: pgmap v807: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.2 MiB/s rd, 937 B/s wr, 44 op/s
Oct 10 10:12:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:12:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:52.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:12:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:52 compute-2 nova_compute[235775]: 2025-10-10 10:12:52.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:12:52 compute-2 nova_compute[235775]: 2025-10-10 10:12:52.816 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:12:52 compute-2 nova_compute[235775]: 2025-10-10 10:12:52.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:12:52 compute-2 nova_compute[235775]: 2025-10-10 10:12:52.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:12:52 compute-2 nova_compute[235775]: 2025-10-10 10:12:52.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:12:52 compute-2 nova_compute[235775]: 2025-10-10 10:12:52.842 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:12:52 compute-2 nova_compute[235775]: 2025-10-10 10:12:52.842 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 10:12:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1175062627' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbce4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:12:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4031376606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.305 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.432 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.433 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5209MB free_disk=59.94288635253906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.433 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.434 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.508 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.508 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.525 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:53 compute-2 podman[240219]: 2025-10-10 10:12:53.771224183 +0000 UTC m=+0.051672931 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Oct 10 10:12:53 compute-2 podman[240221]: 2025-10-10 10:12:53.771660087 +0000 UTC m=+0.047843189 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:12:53 compute-2 podman[240220]: 2025-10-10 10:12:53.798510584 +0000 UTC m=+0.074594882 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 10:12:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:12:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1943454956' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:12:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:53.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.940 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.945 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.967 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.969 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:12:53 compute-2 nova_compute[235775]: 2025-10-10 10:12:53.969 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:12:54 compute-2 ceph-mon[74913]: pgmap v808: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 1.6 MiB/s rd, 2.1 MiB/s wr, 107 op/s
Oct 10 10:12:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4031376606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:12:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/881918671' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:12:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1943454956' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:12:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:54.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:54 compute-2 nova_compute[235775]: 2025-10-10 10:12:54.968 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:12:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:55 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbce0001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:55 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcbc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1031806664' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:12:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:55 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:12:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:55.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:56 compute-2 ceph-mon[74913]: pgmap v809: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Oct 10 10:12:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2515670112' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:12:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:56.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:57 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcc4000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101257 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:12:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:57 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbce0001c40 fd 49 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:57 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcbc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:57.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:58 compute-2 sudo[240290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:12:58 compute-2 sudo[240290]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:12:58 compute-2 sudo[240290]: pam_unix(sudo:session): session closed for user root
Oct 10 10:12:58 compute-2 ceph-mon[74913]: pgmap v810: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Oct 10 10:12:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:12:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:58.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:12:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:59 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcc4000fa0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:59 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:59 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:12:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:12:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:12:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:12:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:59.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:13:00 compute-2 ceph-mon[74913]: pgmap v811: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 328 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Oct 10 10:13:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:00.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:01 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:01 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcc40020e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:01 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcbc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:01.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:02 compute-2 ceph-mon[74913]: pgmap v812: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 322 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Oct 10 10:13:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:13:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:02.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:03 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:03 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:03 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:13:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:03.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:13:04 compute-2 ceph-mon[74913]: pgmap v813: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 323 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:13:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:04.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:05 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcbc0016a0 fd 39 proxy ignored for local
Oct 10 10:13:05 compute-2 kernel: ganesha.nfsd[240196]: segfault at 50 ip 00007fbd94a5932e sp 00007fbd517f9210 error 4 in libntirpc.so.5.8[7fbd94a3e000+2c000] likely on CPU 4 (core 0, socket 4)
Oct 10 10:13:05 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 10:13:05 compute-2 systemd[1]: Started Process Core Dump (PID 240323/UID 0).
Oct 10 10:13:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:05.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:06 compute-2 systemd-coredump[240324]: Process 240111 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 55:
                                                    #0  0x00007fbd94a5932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Oct 10 10:13:06 compute-2 ceph-mon[74913]: pgmap v814: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 937 B/s rd, 14 KiB/s wr, 1 op/s
Oct 10 10:13:06 compute-2 systemd[1]: systemd-coredump@10-240323-0.service: Deactivated successfully.
Oct 10 10:13:06 compute-2 systemd[1]: systemd-coredump@10-240323-0.service: Consumed 1.196s CPU time.
Oct 10 10:13:06 compute-2 podman[240330]: 2025-10-10 10:13:06.394991156 +0000 UTC m=+0.025346221 container died a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Oct 10 10:13:06 compute-2 systemd[1]: var-lib-containers-storage-overlay-f3151ba89ddb65e49d6e0646af8407c3119e5a0ce05c4a2b5e5272229a03a14f-merged.mount: Deactivated successfully.
Oct 10 10:13:06 compute-2 podman[240330]: 2025-10-10 10:13:06.424784547 +0000 UTC m=+0.055139602 container remove a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True)
Oct 10 10:13:06 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 10:13:06 compute-2 podman[240329]: 2025-10-10 10:13:06.469561656 +0000 UTC m=+0.075020866 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:13:06 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 10:13:06 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.294s CPU time.
Oct 10 10:13:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:06.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:07.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:08 compute-2 ceph-mon[74913]: pgmap v815: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 14 KiB/s wr, 1 op/s
Oct 10 10:13:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:08.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:09.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:10 compute-2 ceph-mon[74913]: pgmap v816: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 15 KiB/s wr, 1 op/s
Oct 10 10:13:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:10.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101311 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:13:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:11.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:12 compute-2 ceph-mon[74913]: pgmap v817: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 3.0 KiB/s wr, 0 op/s
Oct 10 10:13:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:12.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:13 compute-2 ceph-mon[74913]: pgmap v818: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 7.7 KiB/s rd, 3.0 KiB/s wr, 1 op/s
Oct 10 10:13:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:13.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:14.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:15.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:15 compute-2 ceph-mon[74913]: pgmap v819: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 7.6 KiB/s rd, 1023 B/s wr, 1 op/s
Oct 10 10:13:16 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 11.
Oct 10 10:13:16 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:13:16 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.294s CPU time.
Oct 10 10:13:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:16.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:16 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 10:13:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:16 compute-2 podman[240449]: 2025-10-10 10:13:16.897486107 +0000 UTC m=+0.037370684 container create 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Oct 10 10:13:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f685c822357fb25a63d78c0de3edff79157420b24cde6f68449c7f664af3204/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 10:13:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f685c822357fb25a63d78c0de3edff79157420b24cde6f68449c7f664af3204/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 10:13:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f685c822357fb25a63d78c0de3edff79157420b24cde6f68449c7f664af3204/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:13:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f685c822357fb25a63d78c0de3edff79157420b24cde6f68449c7f664af3204/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:13:16 compute-2 podman[240449]: 2025-10-10 10:13:16.957077229 +0000 UTC m=+0.096961846 container init 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 10:13:16 compute-2 podman[240449]: 2025-10-10 10:13:16.966939084 +0000 UTC m=+0.106823661 container start 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 10:13:16 compute-2 bash[240449]: 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da
Oct 10 10:13:16 compute-2 podman[240449]: 2025-10-10 10:13:16.880180854 +0000 UTC m=+0.020065461 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 10:13:16 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:13:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:16 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 10:13:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:16 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 10:13:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:13:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 10:13:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 10:13:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 10:13:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 10:13:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 10:13:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:13:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:17.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:18 compute-2 ceph-mon[74913]: pgmap v820: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 7.6 KiB/s rd, 1023 B/s wr, 1 op/s
Oct 10 10:13:18 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2423340283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:13:18 compute-2 sudo[240506]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:13:18 compute-2 sudo[240506]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:13:18 compute-2 sudo[240506]: pam_unix(sudo:session): session closed for user root
Oct 10 10:13:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:18.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:19.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:20 compute-2 ceph-mon[74913]: pgmap v821: 353 pgs: 353 active+clean; 41 MiB data, 271 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 7.9 KiB/s wr, 29 op/s
Oct 10 10:13:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:20.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:21.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:22 compute-2 ceph-mon[74913]: pgmap v822: 353 pgs: 353 active+clean; 41 MiB data, 271 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 6.9 KiB/s wr, 29 op/s
Oct 10 10:13:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:22.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:13:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:13:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:23.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:24 compute-2 ceph-mon[74913]: pgmap v823: 353 pgs: 353 active+clean; 41 MiB data, 271 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 7.4 KiB/s wr, 30 op/s
Oct 10 10:13:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:24.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:24 compute-2 podman[240539]: 2025-10-10 10:13:24.805580189 +0000 UTC m=+0.069327274 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 10 10:13:24 compute-2 podman[240541]: 2025-10-10 10:13:24.807255913 +0000 UTC m=+0.068797648 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid)
Oct 10 10:13:24 compute-2 podman[240540]: 2025-10-10 10:13:24.845133522 +0000 UTC m=+0.104896320 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct 10 10:13:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:25.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:26 compute-2 ceph-mon[74913]: pgmap v824: 353 pgs: 353 active+clean; 41 MiB data, 271 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 7.4 KiB/s wr, 29 op/s
Oct 10 10:13:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:26.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3183478379' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:13:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3183478379' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:13:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:27.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:28 compute-2 ceph-mon[74913]: pgmap v825: 353 pgs: 353 active+clean; 41 MiB data, 271 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 7.4 KiB/s wr, 29 op/s
Oct 10 10:13:28 compute-2 sudo[240606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:13:28 compute-2 sudo[240606]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:13:28 compute-2 sudo[240606]: pam_unix(sudo:session): session closed for user root
Oct 10 10:13:28 compute-2 sudo[240632]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:13:28 compute-2 sudo[240632]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:13:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:28.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 10:13:29 compute-2 sudo[240632]: pam_unix(sudo:session): session closed for user root
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:13:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:29.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:13:30 compute-2 ceph-mon[74913]: pgmap v826: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 7.8 KiB/s wr, 31 op/s
Oct 10 10:13:30 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:13:30 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:13:30 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:13:30 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:13:30 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:13:30 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:13:30 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:13:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:30.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:31.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:32 compute-2 ceph-mon[74913]: pgmap v827: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:13:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:13:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:32.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101333 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 10:13:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:33 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:33 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:33 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:33.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:34 compute-2 ceph-mon[74913]: pgmap v828: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Oct 10 10:13:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:13:34 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:13:34 compute-2 sudo[240709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:13:34 compute-2 sudo[240709]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:13:34 compute-2 sudo[240709]: pam_unix(sudo:session): session closed for user root
Oct 10 10:13:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:34.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:35 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:35 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:35 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:35.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:36 compute-2 ceph-mon[74913]: pgmap v829: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:13:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:13:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:36.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:13:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:36 compute-2 podman[240737]: 2025-10-10 10:13:36.809519155 +0000 UTC m=+0.070322617 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct 10 10:13:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:37 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:37 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:37 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:37 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:13:37.855 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:13:37 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:13:37.856 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:13:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:37.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:38 compute-2 ceph-mon[74913]: pgmap v830: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:13:38 compute-2 sudo[240757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:13:38 compute-2 sudo[240757]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:13:38 compute-2 sudo[240757]: pam_unix(sudo:session): session closed for user root
Oct 10 10:13:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:38.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:39 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:39 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:39 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:39.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:40 compute-2 ceph-mon[74913]: pgmap v831: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Oct 10 10:13:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:40.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:41 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:41 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:41 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:13:41.465 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:13:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:13:41.466 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:13:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:13:41.466 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:13:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:41.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:42 compute-2 ceph-mon[74913]: pgmap v832: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Oct 10 10:13:42 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1897484170' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:13:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:42.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:42 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:13:42.858 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:13:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:43 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:43 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:43 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:43.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101344 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:13:44 compute-2 ceph-mon[74913]: pgmap v833: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:13:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:44.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:45 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:45 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:45 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:45 compute-2 ceph-mon[74913]: pgmap v834: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:13:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:45.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:13:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:46.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:47 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:47 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:47 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1301345308' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:13:47 compute-2 ceph-mon[74913]: pgmap v835: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Oct 10 10:13:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2845144114' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:13:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:47.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:48.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:49 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:49 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:49 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:49 compute-2 nova_compute[235775]: 2025-10-10 10:13:49.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:13:49 compute-2 nova_compute[235775]: 2025-10-10 10:13:49.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:13:49 compute-2 nova_compute[235775]: 2025-10-10 10:13:49.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:13:49 compute-2 nova_compute[235775]: 2025-10-10 10:13:49.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:13:49 compute-2 nova_compute[235775]: 2025-10-10 10:13:49.830 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:13:49 compute-2 nova_compute[235775]: 2025-10-10 10:13:49.830 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:13:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:49.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:50 compute-2 ceph-mon[74913]: pgmap v836: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Oct 10 10:13:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:13:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:50.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:13:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:50 compute-2 nova_compute[235775]: 2025-10-10 10:13:50.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:13:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:51 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:51 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:51 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:52.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:52 compute-2 ceph-mon[74913]: pgmap v837: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Oct 10 10:13:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:52.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:52 compute-2 nova_compute[235775]: 2025-10-10 10:13:52.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:13:52 compute-2 nova_compute[235775]: 2025-10-10 10:13:52.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:13:52 compute-2 nova_compute[235775]: 2025-10-10 10:13:52.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:13:52 compute-2 nova_compute[235775]: 2025-10-10 10:13:52.854 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:13:52 compute-2 nova_compute[235775]: 2025-10-10 10:13:52.855 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:13:52 compute-2 nova_compute[235775]: 2025-10-10 10:13:52.855 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:13:52 compute-2 nova_compute[235775]: 2025-10-10 10:13:52.855 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:13:52 compute-2 nova_compute[235775]: 2025-10-10 10:13:52.855 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:13:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:53 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:53 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:13:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2920225613' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:13:53 compute-2 nova_compute[235775]: 2025-10-10 10:13:53.280 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:13:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:53 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:53 compute-2 nova_compute[235775]: 2025-10-10 10:13:53.463 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:13:53 compute-2 nova_compute[235775]: 2025-10-10 10:13:53.464 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5195MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:13:53 compute-2 nova_compute[235775]: 2025-10-10 10:13:53.464 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:13:53 compute-2 nova_compute[235775]: 2025-10-10 10:13:53.464 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:13:53 compute-2 nova_compute[235775]: 2025-10-10 10:13:53.568 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:13:53 compute-2 nova_compute[235775]: 2025-10-10 10:13:53.568 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:13:53 compute-2 nova_compute[235775]: 2025-10-10 10:13:53.584 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:13:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:54.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:13:54 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3365250461' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:13:54 compute-2 ceph-mon[74913]: pgmap v838: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Oct 10 10:13:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2920225613' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:13:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1533079177' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:13:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3365250461' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:13:54 compute-2 nova_compute[235775]: 2025-10-10 10:13:54.048 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:13:54 compute-2 nova_compute[235775]: 2025-10-10 10:13:54.053 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:13:54 compute-2 nova_compute[235775]: 2025-10-10 10:13:54.068 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:13:54 compute-2 nova_compute[235775]: 2025-10-10 10:13:54.069 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:13:54 compute-2 nova_compute[235775]: 2025-10-10 10:13:54.069 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:13:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:54.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1769663104' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:13:55 compute-2 nova_compute[235775]: 2025-10-10 10:13:55.065 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:13:55 compute-2 nova_compute[235775]: 2025-10-10 10:13:55.080 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:13:55 compute-2 nova_compute[235775]: 2025-10-10 10:13:55.081 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:13:55 compute-2 nova_compute[235775]: 2025-10-10 10:13:55.081 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:13:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:55 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:55 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:55 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:13:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:55 compute-2 podman[240844]: 2025-10-10 10:13:55.786638953 +0000 UTC m=+0.062505351 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251001)
Oct 10 10:13:55 compute-2 podman[240846]: 2025-10-10 10:13:55.801653897 +0000 UTC m=+0.064363641 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:13:55 compute-2 podman[240845]: 2025-10-10 10:13:55.81855471 +0000 UTC m=+0.087283798 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 10:13:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:56.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:56 compute-2 ceph-mon[74913]: pgmap v839: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Oct 10 10:13:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1438646568' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:13:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:56.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2710916285' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:13:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:57 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:57 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:57 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:58.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:58 compute-2 ceph-mon[74913]: pgmap v840: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Oct 10 10:13:58 compute-2 sudo[240907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:13:58 compute-2 sudo[240907]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:13:58 compute-2 sudo[240907]: pam_unix(sudo:session): session closed for user root
Oct 10 10:13:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:13:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:13:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:58.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:13:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:59 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:59 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:59 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:13:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:13:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:00.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:00 compute-2 ceph-mon[74913]: pgmap v841: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Oct 10 10:14:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:00.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:01 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:01 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:01 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:02.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:02 compute-2 ceph-mon[74913]: pgmap v842: 353 pgs: 353 active+clean; 88 MiB data, 277 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Oct 10 10:14:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:14:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:02.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:03 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:03 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:03 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:14:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:04.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:14:04 compute-2 ceph-mon[74913]: pgmap v843: 353 pgs: 353 active+clean; 116 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 126 op/s
Oct 10 10:14:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:04.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:05 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:05 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:05 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:05 compute-2 unix_chkpwd[240945]: password check failed for user (root)
Oct 10 10:14:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:05 compute-2 sshd-session[240942]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 10 10:14:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:06.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:06 compute-2 ceph-mon[74913]: pgmap v844: 353 pgs: 353 active+clean; 116 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 159 KiB/s rd, 2.1 MiB/s wr, 53 op/s
Oct 10 10:14:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:06.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:07 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:07 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:07 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:07 compute-2 sshd-session[240942]: Failed password for root from 193.46.255.99 port 36490 ssh2
Oct 10 10:14:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:07 compute-2 podman[240948]: 2025-10-10 10:14:07.784576386 +0000 UTC m=+0.057779500 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Oct 10 10:14:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:14:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:08.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:14:08 compute-2 ceph-mon[74913]: pgmap v845: 353 pgs: 353 active+clean; 116 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 159 KiB/s rd, 2.1 MiB/s wr, 53 op/s
Oct 10 10:14:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:08.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:09 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:09 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:09 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:09 compute-2 unix_chkpwd[240969]: password check failed for user (root)
Oct 10 10:14:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:14:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:10.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:14:10 compute-2 ceph-mon[74913]: pgmap v846: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 187 KiB/s rd, 2.1 MiB/s wr, 57 op/s
Oct 10 10:14:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:10.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:11 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:11 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:11 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:11 compute-2 sshd-session[240942]: Failed password for root from 193.46.255.99 port 36490 ssh2
Oct 10 10:14:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:12.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:12 compute-2 ceph-mon[74913]: pgmap v847: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 187 KiB/s rd, 2.1 MiB/s wr, 57 op/s
Oct 10 10:14:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:12.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:13 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101413 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:14:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:13 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:13 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:13 compute-2 unix_chkpwd[240974]: password check failed for user (root)
Oct 10 10:14:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:14.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:14 compute-2 ceph-mon[74913]: pgmap v848: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 192 KiB/s rd, 2.1 MiB/s wr, 58 op/s
Oct 10 10:14:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:14.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:15 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:15 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:15 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:15 compute-2 sshd-session[240942]: Failed password for root from 193.46.255.99 port 36490 ssh2
Oct 10 10:14:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:14:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:16.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:14:16 compute-2 ceph-mon[74913]: pgmap v849: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 35 KiB/s wr, 5 op/s
Oct 10 10:14:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:16.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:17 compute-2 nova_compute[235775]: 2025-10-10 10:14:17.102 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:17 compute-2 nova_compute[235775]: 2025-10-10 10:14:17.103 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:17 compute-2 nova_compute[235775]: 2025-10-10 10:14:17.119 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 10 10:14:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:17 compute-2 nova_compute[235775]: 2025-10-10 10:14:17.206 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:17 compute-2 nova_compute[235775]: 2025-10-10 10:14:17.206 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:17 compute-2 nova_compute[235775]: 2025-10-10 10:14:17.213 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 10 10:14:17 compute-2 nova_compute[235775]: 2025-10-10 10:14:17.213 2 INFO nova.compute.claims [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Claim successful on node compute-2.ctlplane.example.com
Oct 10 10:14:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:14:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:17 compute-2 nova_compute[235775]: 2025-10-10 10:14:17.330 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:17 compute-2 sshd-session[240942]: Received disconnect from 193.46.255.99 port 36490:11:  [preauth]
Oct 10 10:14:17 compute-2 sshd-session[240942]: Disconnected from authenticating user root 193.46.255.99 port 36490 [preauth]
Oct 10 10:14:17 compute-2 sshd-session[240942]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 10 10:14:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:17 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:14:17 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3515183494' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:17 compute-2 nova_compute[235775]: 2025-10-10 10:14:17.837 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:17 compute-2 nova_compute[235775]: 2025-10-10 10:14:17.844 2 DEBUG nova.compute.provider_tree [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:14:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:14:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:18.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.095 2 DEBUG nova.scheduler.client.report [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.116 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.910s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.117 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.158 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.158 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.189 2 INFO nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.209 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 10 10:14:18 compute-2 ceph-mon[74913]: pgmap v850: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 35 KiB/s wr, 5 op/s
Oct 10 10:14:18 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3515183494' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:18 compute-2 unix_chkpwd[241003]: password check failed for user (root)
Oct 10 10:14:18 compute-2 sshd-session[240999]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.305 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.307 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.308 2 INFO nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Creating image(s)
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.347 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.386 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.416 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.420 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.421 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:18 compute-2 sudo[241058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:14:18 compute-2 sudo[241058]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:14:18 compute-2 sudo[241058]: pam_unix(sudo:session): session closed for user root
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.694 2 DEBUG nova.virt.libvirt.imagebackend [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image locations are: [{'url': 'rbd://21f084a3-af34-5230-afe4-ea5cd24a55f4/images/5ae78700-970d-45b4-a57d-978a054c7519/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://21f084a3-af34-5230-afe4-ea5cd24a55f4/images/5ae78700-970d-45b4-a57d-978a054c7519/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Oct 10 10:14:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:18.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.811 2 WARNING oslo_policy.policy [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.811 2 WARNING oslo_policy.policy [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Oct 10 10:14:18 compute-2 nova_compute[235775]: 2025-10-10 10:14:18.816 2 DEBUG nova.policy [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7956778c03764aaf8906c9b435337976', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 10 10:14:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:19 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:19 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:19 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:19 compute-2 nova_compute[235775]: 2025-10-10 10:14:19.491 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:19 compute-2 nova_compute[235775]: 2025-10-10 10:14:19.561 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.part --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:19 compute-2 nova_compute[235775]: 2025-10-10 10:14:19.563 2 DEBUG nova.virt.images [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] 5ae78700-970d-45b4-a57d-978a054c7519 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Oct 10 10:14:19 compute-2 nova_compute[235775]: 2025-10-10 10:14:19.565 2 DEBUG nova.privsep.utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct 10 10:14:19 compute-2 nova_compute[235775]: 2025-10-10 10:14:19.566 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.part /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:19 compute-2 nova_compute[235775]: 2025-10-10 10:14:19.788 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.part /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.converted" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:19 compute-2 nova_compute[235775]: 2025-10-10 10:14:19.795 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:19 compute-2 nova_compute[235775]: 2025-10-10 10:14:19.862 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.converted --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:19 compute-2 nova_compute[235775]: 2025-10-10 10:14:19.863 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:19 compute-2 nova_compute[235775]: 2025-10-10 10:14:19.888 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:14:19 compute-2 nova_compute[235775]: 2025-10-10 10:14:19.891 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:14:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:20.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:14:20 compute-2 nova_compute[235775]: 2025-10-10 10:14:20.045 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Successfully created port: be812d6f-78ad-4f90-9cd0-0ae2444e7f71 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 10 10:14:20 compute-2 sshd-session[240999]: Failed password for root from 193.46.255.99 port 13116 ssh2
Oct 10 10:14:20 compute-2 nova_compute[235775]: 2025-10-10 10:14:20.185 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.294s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:20 compute-2 nova_compute[235775]: 2025-10-10 10:14:20.270 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] resizing rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct 10 10:14:20 compute-2 ceph-mon[74913]: pgmap v851: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 39 KiB/s wr, 5 op/s
Oct 10 10:14:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:20 compute-2 unix_chkpwd[241197]: password check failed for user (root)
Oct 10 10:14:20 compute-2 nova_compute[235775]: 2025-10-10 10:14:20.410 2 DEBUG nova.objects.instance [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'migration_context' on Instance uuid f6ec6baf-a91e-4c7e-b1cf-b176d952068f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 10:14:20 compute-2 nova_compute[235775]: 2025-10-10 10:14:20.423 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 10 10:14:20 compute-2 nova_compute[235775]: 2025-10-10 10:14:20.424 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Ensure instance console log exists: /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 10 10:14:20 compute-2 nova_compute[235775]: 2025-10-10 10:14:20.424 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:20 compute-2 nova_compute[235775]: 2025-10-10 10:14:20.425 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:20 compute-2 nova_compute[235775]: 2025-10-10 10:14:20.425 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:20.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:21 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:21 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:21 compute-2 nova_compute[235775]: 2025-10-10 10:14:21.292 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Successfully updated port: be812d6f-78ad-4f90-9cd0-0ae2444e7f71 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 10 10:14:21 compute-2 nova_compute[235775]: 2025-10-10 10:14:21.312 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:14:21 compute-2 nova_compute[235775]: 2025-10-10 10:14:21.313 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquired lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:14:21 compute-2 nova_compute[235775]: 2025-10-10 10:14:21.313 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 10 10:14:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:21 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:21 compute-2 nova_compute[235775]: 2025-10-10 10:14:21.429 2 DEBUG nova.compute.manager [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-changed-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:14:21 compute-2 nova_compute[235775]: 2025-10-10 10:14:21.429 2 DEBUG nova.compute.manager [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Refreshing instance network info cache due to event network-changed-be812d6f-78ad-4f90-9cd0-0ae2444e7f71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 10 10:14:21 compute-2 nova_compute[235775]: 2025-10-10 10:14:21.430 2 DEBUG oslo_concurrency.lockutils [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:14:21 compute-2 nova_compute[235775]: 2025-10-10 10:14:21.497 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 10 10:14:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:14:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:22.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:14:22 compute-2 sshd-session[240999]: Failed password for root from 193.46.255.99 port 13116 ssh2
Oct 10 10:14:22 compute-2 ceph-mon[74913]: pgmap v852: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 5.1 KiB/s rd, 16 KiB/s wr, 0 op/s
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.347 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updating instance_info_cache with network_info: [{"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.366 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Releasing lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.367 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Instance network_info: |[{"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.367 2 DEBUG oslo_concurrency.lockutils [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.367 2 DEBUG nova.network.neutron [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Refreshing network info cache for port be812d6f-78ad-4f90-9cd0-0ae2444e7f71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.370 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Start _get_guest_xml network_info=[{"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'encryption_options': None, 'image_id': '5ae78700-970d-45b4-a57d-978a054c7519'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.374 2 WARNING nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.378 2 DEBUG nova.virt.libvirt.host [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.379 2 DEBUG nova.virt.libvirt.host [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.383 2 DEBUG nova.virt.libvirt.host [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.383 2 DEBUG nova.virt.libvirt.host [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.384 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.384 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-10T10:09:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='00373e71-6208-4238-ad85-db0452c53bc6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.385 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.385 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.385 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.385 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.385 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.386 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.386 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.386 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.386 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.387 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.390 2 DEBUG nova.privsep.utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.390 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:22 compute-2 unix_chkpwd[241212]: password check failed for user (root)
Oct 10 10:14:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:22.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:22 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 10:14:22 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1953104862' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.828 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.866 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:14:22 compute-2 nova_compute[235775]: 2025-10-10 10:14:22.870 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:23 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 10:14:23 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1711456005' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:14:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.297 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.299 2 DEBUG nova.virt.libvirt.vif [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:14:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1362038391',display_name='tempest-TestNetworkBasicOps-server-1362038391',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1362038391',id=5,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOaO/Dm5TZJdJA+p0WorpE1s/wHDKiboiIskSllf2vhdjUj1oz81caVPGQVtZrwI+VVMAczLEmtRNwhb15+QK4so2BghvGEI3ChmYsvOZuU3tzU+nN+IQyotPE2q48Vw5A==',key_name='tempest-TestNetworkBasicOps-804562104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-ksfjfy6b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:14:18Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=f6ec6baf-a91e-4c7e-b1cf-b176d952068f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.299 2 DEBUG nova.network.os_vif_util [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.301 2 DEBUG nova.network.os_vif_util [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.304 2 DEBUG nova.objects.instance [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid f6ec6baf-a91e-4c7e-b1cf-b176d952068f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 10:14:23 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1953104862' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:14:23 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1711456005' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:14:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.330 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] End _get_guest_xml xml=<domain type="kvm">
Oct 10 10:14:23 compute-2 nova_compute[235775]:   <uuid>f6ec6baf-a91e-4c7e-b1cf-b176d952068f</uuid>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   <name>instance-00000005</name>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   <memory>131072</memory>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   <vcpu>1</vcpu>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   <metadata>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <nova:name>tempest-TestNetworkBasicOps-server-1362038391</nova:name>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <nova:creationTime>2025-10-10 10:14:22</nova:creationTime>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <nova:flavor name="m1.nano">
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <nova:memory>128</nova:memory>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <nova:disk>1</nova:disk>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <nova:swap>0</nova:swap>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <nova:ephemeral>0</nova:ephemeral>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <nova:vcpus>1</nova:vcpus>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       </nova:flavor>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <nova:owner>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <nova:user uuid="7956778c03764aaf8906c9b435337976">tempest-TestNetworkBasicOps-188749107-project-member</nova:user>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <nova:project uuid="d5e531d4b440422d946eaf6fd4e166f7">tempest-TestNetworkBasicOps-188749107</nova:project>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       </nova:owner>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <nova:root type="image" uuid="5ae78700-970d-45b4-a57d-978a054c7519"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <nova:ports>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <nova:port uuid="be812d6f-78ad-4f90-9cd0-0ae2444e7f71">
Oct 10 10:14:23 compute-2 nova_compute[235775]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         </nova:port>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       </nova:ports>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     </nova:instance>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   </metadata>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   <sysinfo type="smbios">
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <system>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <entry name="manufacturer">RDO</entry>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <entry name="product">OpenStack Compute</entry>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <entry name="serial">f6ec6baf-a91e-4c7e-b1cf-b176d952068f</entry>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <entry name="uuid">f6ec6baf-a91e-4c7e-b1cf-b176d952068f</entry>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <entry name="family">Virtual Machine</entry>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     </system>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   </sysinfo>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   <os>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <boot dev="hd"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <smbios mode="sysinfo"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   </os>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   <features>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <acpi/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <apic/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <vmcoreinfo/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   </features>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   <clock offset="utc">
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <timer name="pit" tickpolicy="delay"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <timer name="hpet" present="no"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   </clock>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   <cpu mode="host-model" match="exact">
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <topology sockets="1" cores="1" threads="1"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   </cpu>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   <devices>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <disk type="network" device="disk">
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <driver type="raw" cache="none"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <source protocol="rbd" name="vms/f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk">
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <host name="192.168.122.100" port="6789"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <host name="192.168.122.102" port="6789"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <host name="192.168.122.101" port="6789"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       </source>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <auth username="openstack">
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       </auth>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <target dev="vda" bus="virtio"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     </disk>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <disk type="network" device="cdrom">
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <driver type="raw" cache="none"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <source protocol="rbd" name="vms/f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config">
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <host name="192.168.122.100" port="6789"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <host name="192.168.122.102" port="6789"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <host name="192.168.122.101" port="6789"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       </source>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <auth username="openstack">
Oct 10 10:14:23 compute-2 nova_compute[235775]:         <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       </auth>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <target dev="sda" bus="sata"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     </disk>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <interface type="ethernet">
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <mac address="fa:16:3e:35:91:37"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <model type="virtio"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <driver name="vhost" rx_queue_size="512"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <mtu size="1442"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <target dev="tapbe812d6f-78"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     </interface>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <serial type="pty">
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <log file="/var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/console.log" append="off"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     </serial>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <video>
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <model type="virtio"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     </video>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <input type="tablet" bus="usb"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <rng model="virtio">
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <backend model="random">/dev/urandom</backend>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     </rng>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <controller type="usb" index="0"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     <memballoon model="virtio">
Oct 10 10:14:23 compute-2 nova_compute[235775]:       <stats period="10"/>
Oct 10 10:14:23 compute-2 nova_compute[235775]:     </memballoon>
Oct 10 10:14:23 compute-2 nova_compute[235775]:   </devices>
Oct 10 10:14:23 compute-2 nova_compute[235775]: </domain>
Oct 10 10:14:23 compute-2 nova_compute[235775]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.333 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Preparing to wait for external event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.334 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.335 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.335 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.336 2 DEBUG nova.virt.libvirt.vif [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:14:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1362038391',display_name='tempest-TestNetworkBasicOps-server-1362038391',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1362038391',id=5,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOaO/Dm5TZJdJA+p0WorpE1s/wHDKiboiIskSllf2vhdjUj1oz81caVPGQVtZrwI+VVMAczLEmtRNwhb15+QK4so2BghvGEI3ChmYsvOZuU3tzU+nN+IQyotPE2q48Vw5A==',key_name='tempest-TestNetworkBasicOps-804562104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-ksfjfy6b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:14:18Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=f6ec6baf-a91e-4c7e-b1cf-b176d952068f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.336 2 DEBUG nova.network.os_vif_util [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.337 2 DEBUG nova.network.os_vif_util [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.338 2 DEBUG os_vif [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.383 2 DEBUG nova.network.neutron [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updated VIF entry in instance network info cache for port be812d6f-78ad-4f90-9cd0-0ae2444e7f71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.383 2 DEBUG nova.network.neutron [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updating instance_info_cache with network_info: [{"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.392 2 DEBUG ovsdbapp.backend.ovs_idl [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.393 2 DEBUG ovsdbapp.backend.ovs_idl [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.393 2 DEBUG ovsdbapp.backend.ovs_idl [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [POLLOUT] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.398 2 DEBUG oslo_concurrency.lockutils [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.409 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.409 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 10 10:14:23 compute-2 nova_compute[235775]: 2025-10-10 10:14:23.410 2 INFO oslo.privsep.daemon [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpd1vgorxc/privsep.sock']
Oct 10 10:14:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:24.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.122 2 INFO oslo.privsep.daemon [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Spawned new privsep daemon via rootwrap
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.021 697 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.029 697 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.033 697 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.034 697 INFO oslo.privsep.daemon [-] privsep daemon running as pid 697
Oct 10 10:14:24 compute-2 ceph-mon[74913]: pgmap v853: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 35 op/s
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.512 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe812d6f-78, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.513 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbe812d6f-78, col_values=(('external_ids', {'iface-id': 'be812d6f-78ad-4f90-9cd0-0ae2444e7f71', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:91:37', 'vm-uuid': 'f6ec6baf-a91e-4c7e-b1cf-b176d952068f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:24 compute-2 NetworkManager[44866]: <info>  [1760091264.5163] manager: (tapbe812d6f-78): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.524 2 INFO os_vif [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78')
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.573 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.574 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.574 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No VIF found with MAC fa:16:3e:35:91:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.574 2 INFO nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Using config drive
Oct 10 10:14:24 compute-2 nova_compute[235775]: 2025-10-10 10:14:24.604 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:14:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:24 compute-2 sshd-session[240999]: Failed password for root from 193.46.255.99 port 13116 ssh2
Oct 10 10:14:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:24.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.018 2 INFO nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Creating config drive at /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.030 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj48e1hdh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:25 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:25 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.180 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj48e1hdh" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.212 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.215 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:25 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.354 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.356 2 INFO nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Deleting local config drive /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config because it was imported into RBD.
Oct 10 10:14:25 compute-2 ceph-mon[74913]: pgmap v854: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Oct 10 10:14:25 compute-2 systemd[1]: Starting libvirt secret daemon...
Oct 10 10:14:25 compute-2 systemd[1]: Started libvirt secret daemon.
Oct 10 10:14:25 compute-2 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct 10 10:14:25 compute-2 kernel: tapbe812d6f-78: entered promiscuous mode
Oct 10 10:14:25 compute-2 NetworkManager[44866]: <info>  [1760091265.4758] manager: (tapbe812d6f-78): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Oct 10 10:14:25 compute-2 ovn_controller[132503]: 2025-10-10T10:14:25Z|00027|binding|INFO|Claiming lport be812d6f-78ad-4f90-9cd0-0ae2444e7f71 for this chassis.
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:25 compute-2 ovn_controller[132503]: 2025-10-10T10:14:25Z|00028|binding|INFO|be812d6f-78ad-4f90-9cd0-0ae2444e7f71: Claiming fa:16:3e:35:91:37 10.100.0.11
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:25 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:25.497 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:91:37 10.100.0.11'], port_security=['fa:16:3e:35:91:37 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f6ec6baf-a91e-4c7e-b1cf-b176d952068f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2502283d-b38d-456e-8e7f-133a87baf32b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=21e2152f-e965-46e3-9774-988f8fdf189b, chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=be812d6f-78ad-4f90-9cd0-0ae2444e7f71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:14:25 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:25.499 141795 INFO neutron.agent.ovn.metadata.agent [-] Port be812d6f-78ad-4f90-9cd0-0ae2444e7f71 in datapath c8850c4c-dc38-4440-9c03-f2dd59684fe6 bound to our chassis
Oct 10 10:14:25 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:25.502 141795 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c8850c4c-dc38-4440-9c03-f2dd59684fe6
Oct 10 10:14:25 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:25.505 141795 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp5qd_lucz/privsep.sock']
Oct 10 10:14:25 compute-2 systemd-machined[192768]: New machine qemu-1-instance-00000005.
Oct 10 10:14:25 compute-2 systemd-udevd[241382]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 10:14:25 compute-2 NetworkManager[44866]: <info>  [1760091265.5612] device (tapbe812d6f-78): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 10:14:25 compute-2 NetworkManager[44866]: <info>  [1760091265.5626] device (tapbe812d6f-78): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 10 10:14:25 compute-2 systemd[1]: Started Virtual Machine qemu-1-instance-00000005.
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:25 compute-2 ovn_controller[132503]: 2025-10-10T10:14:25Z|00029|binding|INFO|Setting lport be812d6f-78ad-4f90-9cd0-0ae2444e7f71 ovn-installed in OVS
Oct 10 10:14:25 compute-2 ovn_controller[132503]: 2025-10-10T10:14:25Z|00030|binding|INFO|Setting lport be812d6f-78ad-4f90-9cd0-0ae2444e7f71 up in Southbound
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.956 2 DEBUG nova.compute.manager [req-63750100-6233-4a41-a18e-6fa5625a9fd0 req-680961cf-f09f-43b2-a6b5-46fda95083f1 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.956 2 DEBUG oslo_concurrency.lockutils [req-63750100-6233-4a41-a18e-6fa5625a9fd0 req-680961cf-f09f-43b2-a6b5-46fda95083f1 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.957 2 DEBUG oslo_concurrency.lockutils [req-63750100-6233-4a41-a18e-6fa5625a9fd0 req-680961cf-f09f-43b2-a6b5-46fda95083f1 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.957 2 DEBUG oslo_concurrency.lockutils [req-63750100-6233-4a41-a18e-6fa5625a9fd0 req-680961cf-f09f-43b2-a6b5-46fda95083f1 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:25 compute-2 nova_compute[235775]: 2025-10-10 10:14:25.958 2 DEBUG nova.compute.manager [req-63750100-6233-4a41-a18e-6fa5625a9fd0 req-680961cf-f09f-43b2-a6b5-46fda95083f1 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Processing event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 10 10:14:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:26.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:26 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.207 141795 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 10 10:14:26 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.208 141795 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp5qd_lucz/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 10 10:14:26 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.100 241439 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 10 10:14:26 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.104 241439 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 10 10:14:26 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.107 241439 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Oct 10 10:14:26 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.108 241439 INFO oslo.privsep.daemon [-] privsep daemon running as pid 241439
Oct 10 10:14:26 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.212 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[1e2a4b94-7753-4efe-97a6-ef25e3e01843]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:26 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:14:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:26 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:14:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:14:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2445054076' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:14:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:14:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2445054076' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:14:26 compute-2 sshd-session[240999]: Received disconnect from 193.46.255.99 port 13116:11:  [preauth]
Oct 10 10:14:26 compute-2 sshd-session[240999]: Disconnected from authenticating user root 193.46.255.99 port 13116 [preauth]
Oct 10 10:14:26 compute-2 sshd-session[240999]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.492 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 10 10:14:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/2445054076' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:14:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/2445054076' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.493 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091266.4920683, f6ec6baf-a91e-4c7e-b1cf-b176d952068f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.493 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] VM Started (Lifecycle Event)
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.505 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.509 2 INFO nova.virt.libvirt.driver [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Instance spawned successfully.
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.509 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.513 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.516 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.526 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.526 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.527 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.527 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.527 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.528 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.548 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.549 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091266.4930215, f6ec6baf-a91e-4c7e-b1cf-b176d952068f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.549 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] VM Paused (Lifecycle Event)
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.571 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.575 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091266.495544, f6ec6baf-a91e-4c7e-b1cf-b176d952068f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.575 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] VM Resumed (Lifecycle Event)
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.582 2 INFO nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Took 8.28 seconds to spawn the instance on the hypervisor.
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.583 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.593 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.598 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.621 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.655 2 INFO nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Took 9.49 seconds to build instance.
Oct 10 10:14:26 compute-2 nova_compute[235775]: 2025-10-10 10:14:26.677 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:26.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:26 compute-2 podman[241447]: 2025-10-10 10:14:26.805078474 +0000 UTC m=+0.071484580 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 10:14:26 compute-2 podman[241449]: 2025-10-10 10:14:26.819227319 +0000 UTC m=+0.082743402 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 10:14:26 compute-2 podman[241448]: 2025-10-10 10:14:26.822895357 +0000 UTC m=+0.089612983 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible)
Oct 10 10:14:26 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.954 241439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:26 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.954 241439 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:26 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.954 241439 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:27 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:27 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:27 compute-2 unix_chkpwd[241511]: password check failed for user (root)
Oct 10 10:14:27 compute-2 sshd-session[241445]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 10 10:14:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:27 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:27 compute-2 ceph-mon[74913]: pgmap v855: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Oct 10 10:14:27 compute-2 nova_compute[235775]: 2025-10-10 10:14:27.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:27 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.866 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0b7062-ae11-445e-8982-237196222505]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:27 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.868 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc8850c4c-d1 in ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 10 10:14:27 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.870 241439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc8850c4c-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 10 10:14:27 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.870 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[7e87bbae-570d-40df-b30b-f7ea5e4116f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:27 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.875 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[3749dbc3-30dc-46a6-ad15-97ebc956108a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:27 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.907 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[1e4f8f2d-faa1-499f-a3b0-194e05985770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:27 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.941 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[323322d2-c8a5-4cc0-9b99-5bd605869e85]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:27 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.944 141795 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp9fmkvg9o/privsep.sock']
Oct 10 10:14:28 compute-2 nova_compute[235775]: 2025-10-10 10:14:28.042 2 DEBUG nova.compute.manager [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:14:28 compute-2 nova_compute[235775]: 2025-10-10 10:14:28.043 2 DEBUG oslo_concurrency.lockutils [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:28 compute-2 nova_compute[235775]: 2025-10-10 10:14:28.043 2 DEBUG oslo_concurrency.lockutils [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:28 compute-2 nova_compute[235775]: 2025-10-10 10:14:28.044 2 DEBUG oslo_concurrency.lockutils [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:28 compute-2 nova_compute[235775]: 2025-10-10 10:14:28.044 2 DEBUG nova.compute.manager [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] No waiting events found dispatching network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:14:28 compute-2 nova_compute[235775]: 2025-10-10 10:14:28.044 2 WARNING nova.compute.manager [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received unexpected event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 for instance with vm_state active and task_state None.
Oct 10 10:14:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:28.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:28 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.690 141795 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 10 10:14:28 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.692 141795 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp9fmkvg9o/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct 10 10:14:28 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.581 241521 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 10 10:14:28 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.587 241521 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 10 10:14:28 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.590 241521 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 10 10:14:28 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.591 241521 INFO oslo.privsep.daemon [-] privsep daemon running as pid 241521
Oct 10 10:14:28 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.695 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[88a17650-70b2-478e-bdf7-587900d068e9]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:28.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.218 241521 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.218 241521 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.218 241521 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:29 compute-2 sshd-session[241445]: Failed password for root from 193.46.255.99 port 63336 ssh2
Oct 10 10:14:29 compute-2 nova_compute[235775]: 2025-10-10 10:14:29.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.783 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[301b1efe-413a-4940-bef6-789a925536df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.789 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[badf5fcc-fdde-45f3-8c8d-795605347709]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:29 compute-2 NetworkManager[44866]: <info>  [1760091269.7906] manager: (tapc8850c4c-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Oct 10 10:14:29 compute-2 systemd-udevd[241533]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.822 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[a9d98cc5-4ccd-445e-a92e-4653bdd24065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.827 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[f557582d-34f1-4ce5-890f-41c43154f1a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:29 compute-2 NetworkManager[44866]: <info>  [1760091269.8580] device (tapc8850c4c-d0): carrier: link connected
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.863 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[1c3e4673-1b12-47bb-bbc8-3347ec1d7632]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.887 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[62343da3-4007-4ed7-b09b-efb64de195d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8850c4c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:14:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417341, 'reachable_time': 35993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241552, 'error': None, 'target': 'ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.906 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[060d29f0-e432-4ef2-b0a3-44caf5ead820]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2c:1444'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 417341, 'tstamp': 417341}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241553, 'error': None, 'target': 'ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.924 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[3035d08a-b26c-41f3-b54d-6532e02e2962]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8850c4c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:14:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417341, 'reachable_time': 35993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241554, 'error': None, 'target': 'ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:29 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.956 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[165bddf4-2694-4ebd-a507-81bfdb258528]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.007 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[488118f4-9e0a-4e78-81f8-a9493fc87197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.010 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8850c4c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.010 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.011 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8850c4c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:14:30 compute-2 NetworkManager[44866]: <info>  [1760091270.0143] manager: (tapc8850c4c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Oct 10 10:14:30 compute-2 kernel: tapc8850c4c-d0: entered promiscuous mode
Oct 10 10:14:30 compute-2 nova_compute[235775]: 2025-10-10 10:14:30.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:30 compute-2 nova_compute[235775]: 2025-10-10 10:14:30.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.018 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc8850c4c-d0, col_values=(('external_ids', {'iface-id': '185907ee-d118-486d-93ad-c5a1b6a3a149'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:14:30 compute-2 nova_compute[235775]: 2025-10-10 10:14:30.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:30 compute-2 ovn_controller[132503]: 2025-10-10T10:14:30Z|00031|binding|INFO|Releasing lport 185907ee-d118-486d-93ad-c5a1b6a3a149 from this chassis (sb_readonly=0)
Oct 10 10:14:30 compute-2 ceph-mon[74913]: pgmap v856: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 3.1 MiB/s rd, 1.8 MiB/s wr, 91 op/s
Oct 10 10:14:30 compute-2 nova_compute[235775]: 2025-10-10 10:14:30.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.049 141795 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8850c4c-dc38-4440-9c03-f2dd59684fe6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8850c4c-dc38-4440-9c03-f2dd59684fe6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.050 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[e4cfeec1-5c41-4aa2-8c52-3a77df4c3853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:30.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.052 141795 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: global
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     log         /dev/log local0 debug
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     log-tag     haproxy-metadata-proxy-c8850c4c-dc38-4440-9c03-f2dd59684fe6
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     user        root
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     group       root
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     maxconn     1024
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     pidfile     /var/lib/neutron/external/pids/c8850c4c-dc38-4440-9c03-f2dd59684fe6.pid.haproxy
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     daemon
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: defaults
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     log global
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     mode http
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     option httplog
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     option dontlognull
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     option http-server-close
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     option forwardfor
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     retries                 3
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     timeout http-request    30s
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     timeout connect         30s
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     timeout client          32s
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     timeout server          32s
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     timeout http-keep-alive 30s
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: listen listener
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     bind 169.254.169.254:80
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     server metadata /var/lib/neutron/metadata_proxy
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:     http-request add-header X-OVN-Network-ID c8850c4c-dc38-4440-9c03-f2dd59684fe6
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 10 10:14:30 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.052 141795 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'env', 'PROCESS_TAG=haproxy-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c8850c4c-dc38-4440-9c03-f2dd59684fe6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 10 10:14:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:30 compute-2 podman[241587]: 2025-10-10 10:14:30.420203979 +0000 UTC m=+0.054687620 container create 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Oct 10 10:14:30 compute-2 systemd[1]: Started libpod-conmon-0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9.scope.
Oct 10 10:14:30 compute-2 podman[241587]: 2025-10-10 10:14:30.389720729 +0000 UTC m=+0.024204390 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 10:14:30 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:14:30 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62fcae525be285a1d8adf5d06c7c663fa56b70679788d48c992ce41c622e09da/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 10 10:14:30 compute-2 podman[241587]: 2025-10-10 10:14:30.514960896 +0000 UTC m=+0.149444567 container init 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 10 10:14:30 compute-2 podman[241587]: 2025-10-10 10:14:30.524128462 +0000 UTC m=+0.158612113 container start 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 10 10:14:30 compute-2 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [NOTICE]   (241607) : New worker (241609) forked
Oct 10 10:14:30 compute-2 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [NOTICE]   (241607) : Loading success.
Oct 10 10:14:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:30.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 10:14:31 compute-2 unix_chkpwd[241620]: password check failed for user (root)
Oct 10 10:14:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:32 compute-2 ceph-mon[74913]: pgmap v857: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 3.1 MiB/s rd, 1.8 MiB/s wr, 91 op/s
Oct 10 10:14:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:14:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:32.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:32 compute-2 nova_compute[235775]: 2025-10-10 10:14:32.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:14:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:32.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:14:32 compute-2 sshd-session[241445]: Failed password for root from 193.46.255.99 port 63336 ssh2
Oct 10 10:14:32 compute-2 ovn_controller[132503]: 2025-10-10T10:14:32Z|00032|binding|INFO|Releasing lport 185907ee-d118-486d-93ad-c5a1b6a3a149 from this chassis (sb_readonly=0)
Oct 10 10:14:32 compute-2 NetworkManager[44866]: <info>  [1760091272.9525] manager: (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/29)
Oct 10 10:14:32 compute-2 nova_compute[235775]: 2025-10-10 10:14:32.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:32 compute-2 NetworkManager[44866]: <info>  [1760091272.9532] device (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 10:14:32 compute-2 NetworkManager[44866]: <info>  [1760091272.9541] manager: (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/30)
Oct 10 10:14:32 compute-2 NetworkManager[44866]: <info>  [1760091272.9544] device (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 10:14:32 compute-2 NetworkManager[44866]: <info>  [1760091272.9551] manager: (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Oct 10 10:14:32 compute-2 NetworkManager[44866]: <info>  [1760091272.9555] manager: (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Oct 10 10:14:32 compute-2 NetworkManager[44866]: <info>  [1760091272.9558] device (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 10 10:14:32 compute-2 NetworkManager[44866]: <info>  [1760091272.9564] device (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 10 10:14:32 compute-2 ovn_controller[132503]: 2025-10-10T10:14:32Z|00033|binding|INFO|Releasing lport 185907ee-d118-486d-93ad-c5a1b6a3a149 from this chassis (sb_readonly=0)
Oct 10 10:14:32 compute-2 nova_compute[235775]: 2025-10-10 10:14:32.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:32 compute-2 nova_compute[235775]: 2025-10-10 10:14:32.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:33 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy ignored for local
Oct 10 10:14:33 compute-2 kernel: ganesha.nfsd[240938]: segfault at 50 ip 00007fd18c6b432e sp 00007fd145ffa210 error 4 in libntirpc.so.5.8[7fd18c699000+2c000] likely on CPU 3 (core 0, socket 3)
Oct 10 10:14:33 compute-2 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 10:14:33 compute-2 systemd[1]: Started Process Core Dump (PID 241626/UID 0).
Oct 10 10:14:33 compute-2 nova_compute[235775]: 2025-10-10 10:14:33.228 2 DEBUG nova.compute.manager [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-changed-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:14:33 compute-2 nova_compute[235775]: 2025-10-10 10:14:33.229 2 DEBUG nova.compute.manager [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Refreshing instance network info cache due to event network-changed-be812d6f-78ad-4f90-9cd0-0ae2444e7f71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 10 10:14:33 compute-2 nova_compute[235775]: 2025-10-10 10:14:33.229 2 DEBUG oslo_concurrency.lockutils [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:14:33 compute-2 nova_compute[235775]: 2025-10-10 10:14:33.229 2 DEBUG oslo_concurrency.lockutils [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:14:33 compute-2 nova_compute[235775]: 2025-10-10 10:14:33.230 2 DEBUG nova.network.neutron [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Refreshing network info cache for port be812d6f-78ad-4f90-9cd0-0ae2444e7f71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 10 10:14:33 compute-2 unix_chkpwd[241628]: password check failed for user (root)
Oct 10 10:14:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:34.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:34 compute-2 ceph-mon[74913]: pgmap v858: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 109 op/s
Oct 10 10:14:34 compute-2 sudo[241629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:14:34 compute-2 sudo[241629]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:14:34 compute-2 sudo[241629]: pam_unix(sudo:session): session closed for user root
Oct 10 10:14:34 compute-2 nova_compute[235775]: 2025-10-10 10:14:34.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:34 compute-2 sudo[241654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:14:34 compute-2 sudo[241654]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:14:34 compute-2 systemd-coredump[241627]: Process 240468 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 58:
                                                    #0  0x00007fd18c6b432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Oct 10 10:14:34 compute-2 nova_compute[235775]: 2025-10-10 10:14:34.671 2 DEBUG nova.network.neutron [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updated VIF entry in instance network info cache for port be812d6f-78ad-4f90-9cd0-0ae2444e7f71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 10 10:14:34 compute-2 nova_compute[235775]: 2025-10-10 10:14:34.671 2 DEBUG nova.network.neutron [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updating instance_info_cache with network_info: [{"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:14:34 compute-2 nova_compute[235775]: 2025-10-10 10:14:34.691 2 DEBUG oslo_concurrency.lockutils [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:14:34 compute-2 systemd[1]: systemd-coredump@11-241626-0.service: Deactivated successfully.
Oct 10 10:14:34 compute-2 systemd[1]: systemd-coredump@11-241626-0.service: Consumed 1.474s CPU time.
Oct 10 10:14:34 compute-2 podman[241684]: 2025-10-10 10:14:34.761434837 +0000 UTC m=+0.029067985 container died 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1)
Oct 10 10:14:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:34 compute-2 systemd[1]: var-lib-containers-storage-overlay-8f685c822357fb25a63d78c0de3edff79157420b24cde6f68449c7f664af3204-merged.mount: Deactivated successfully.
Oct 10 10:14:34 compute-2 podman[241684]: 2025-10-10 10:14:34.809365579 +0000 UTC m=+0.076998697 container remove 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 10:14:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:34.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:34 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 10:14:34 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 10:14:34 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.298s CPU time.
Oct 10 10:14:35 compute-2 sudo[241654]: pam_unix(sudo:session): session closed for user root
Oct 10 10:14:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:35 compute-2 sshd-session[241445]: Failed password for root from 193.46.255.99 port 63336 ssh2
Oct 10 10:14:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:14:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:14:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:14:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:14:35 compute-2 ceph-mon[74913]: pgmap v859: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 75 op/s
Oct 10 10:14:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:14:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:14:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:14:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:14:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:14:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:14:35 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:14:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:36.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:36.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:37 compute-2 sshd-session[241445]: Received disconnect from 193.46.255.99 port 63336:11:  [preauth]
Oct 10 10:14:37 compute-2 sshd-session[241445]: Disconnected from authenticating user root 193.46.255.99 port 63336 [preauth]
Oct 10 10:14:37 compute-2 sshd-session[241445]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 10 10:14:37 compute-2 nova_compute[235775]: 2025-10-10 10:14:37.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:38 compute-2 ceph-mon[74913]: pgmap v860: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 75 op/s
Oct 10 10:14:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:38.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:38 compute-2 sudo[241765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:14:38 compute-2 sudo[241765]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:14:38 compute-2 sudo[241765]: pam_unix(sudo:session): session closed for user root
Oct 10 10:14:38 compute-2 ovn_controller[132503]: 2025-10-10T10:14:38Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:91:37 10.100.0.11
Oct 10 10:14:38 compute-2 ovn_controller[132503]: 2025-10-10T10:14:38Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:91:37 10.100.0.11
Oct 10 10:14:38 compute-2 podman[241789]: 2025-10-10 10:14:38.632515956 +0000 UTC m=+0.048160240 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Oct 10 10:14:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:14:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:38.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:14:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101439 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 10:14:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [NOTICE] 282/101439 (4) : haproxy version is 2.3.17-d1c9119
Oct 10 10:14:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [NOTICE] 282/101439 (4) : path to executable is /usr/local/sbin/haproxy
Oct 10 10:14:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [ALERT] 282/101439 (4) : backend 'backend' has no server available!
Oct 10 10:14:39 compute-2 nova_compute[235775]: 2025-10-10 10:14:39.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:40 compute-2 ceph-mon[74913]: pgmap v861: 353 pgs: 353 active+clean; 188 MiB data, 333 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 111 op/s
Oct 10 10:14:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:40.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:14:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:40.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:14:40 compute-2 sudo[241813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:14:40 compute-2 sudo[241813]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:14:40 compute-2 sudo[241813]: pam_unix(sudo:session): session closed for user root
Oct 10 10:14:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:41.467 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:41.468 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:41.468 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:41 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:14:41 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:14:41 compute-2 ceph-mon[74913]: pgmap v862: 353 pgs: 353 active+clean; 188 MiB data, 333 MiB used, 60 GiB / 60 GiB avail; 770 KiB/s rd, 2.0 MiB/s wr, 54 op/s
Oct 10 10:14:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:42.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:42 compute-2 nova_compute[235775]: 2025-10-10 10:14:42.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:14:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:42.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:14:43 compute-2 nova_compute[235775]: 2025-10-10 10:14:43.719 2 INFO nova.compute.manager [None req-b91e469f-aff6-42b0-9240-485040d841ba 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Get console output
Oct 10 10:14:43 compute-2 nova_compute[235775]: 2025-10-10 10:14:43.725 2 INFO oslo.privsep.daemon [None req-b91e469f-aff6-42b0-9240-485040d841ba 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp3bylatxb/privsep.sock']
Oct 10 10:14:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:44 compute-2 ceph-mon[74913]: pgmap v863: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 917 KiB/s rd, 2.1 MiB/s wr, 83 op/s
Oct 10 10:14:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:14:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:44.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.372 2 INFO oslo.privsep.daemon [None req-b91e469f-aff6-42b0-9240-485040d841ba 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Spawned new privsep daemon via rootwrap
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.254 763 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.258 763 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.260 763 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.260 763 INFO oslo.privsep.daemon [-] privsep daemon running as pid 763
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.457 763 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.808 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.809 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.809 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.810 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.810 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.812 2 INFO nova.compute.manager [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Terminating instance
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.813 2 DEBUG nova.compute.manager [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 10 10:14:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:44.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:44 compute-2 kernel: tapbe812d6f-78 (unregistering): left promiscuous mode
Oct 10 10:14:44 compute-2 NetworkManager[44866]: <info>  [1760091284.8621] device (tapbe812d6f-78): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 10 10:14:44 compute-2 ovn_controller[132503]: 2025-10-10T10:14:44Z|00034|binding|INFO|Releasing lport be812d6f-78ad-4f90-9cd0-0ae2444e7f71 from this chassis (sb_readonly=0)
Oct 10 10:14:44 compute-2 ovn_controller[132503]: 2025-10-10T10:14:44Z|00035|binding|INFO|Setting lport be812d6f-78ad-4f90-9cd0-0ae2444e7f71 down in Southbound
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:44 compute-2 ovn_controller[132503]: 2025-10-10T10:14:44Z|00036|binding|INFO|Removing iface tapbe812d6f-78 ovn-installed in OVS
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:44 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:44.879 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:91:37 10.100.0.11'], port_security=['fa:16:3e:35:91:37 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f6ec6baf-a91e-4c7e-b1cf-b176d952068f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2502283d-b38d-456e-8e7f-133a87baf32b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=21e2152f-e965-46e3-9774-988f8fdf189b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=be812d6f-78ad-4f90-9cd0-0ae2444e7f71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:14:44 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:44.880 141795 INFO neutron.agent.ovn.metadata.agent [-] Port be812d6f-78ad-4f90-9cd0-0ae2444e7f71 in datapath c8850c4c-dc38-4440-9c03-f2dd59684fe6 unbound from our chassis
Oct 10 10:14:44 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:44.881 141795 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8850c4c-dc38-4440-9c03-f2dd59684fe6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 10 10:14:44 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:44.882 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[9211e87c-afc0-49c3-bb2c-e1e0a7b3dd81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:44 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:44.882 141795 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6 namespace which is not needed anymore
Oct 10 10:14:44 compute-2 nova_compute[235775]: 2025-10-10 10:14:44.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:44 compute-2 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Deactivated successfully.
Oct 10 10:14:44 compute-2 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Consumed 12.673s CPU time.
Oct 10 10:14:44 compute-2 systemd-machined[192768]: Machine qemu-1-instance-00000005 terminated.
Oct 10 10:14:45 compute-2 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [NOTICE]   (241607) : haproxy version is 2.8.14-c23fe91
Oct 10 10:14:45 compute-2 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [NOTICE]   (241607) : path to executable is /usr/sbin/haproxy
Oct 10 10:14:45 compute-2 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [WARNING]  (241607) : Exiting Master process...
Oct 10 10:14:45 compute-2 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [WARNING]  (241607) : Exiting Master process...
Oct 10 10:14:45 compute-2 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [ALERT]    (241607) : Current worker (241609) exited with code 143 (Terminated)
Oct 10 10:14:45 compute-2 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [WARNING]  (241607) : All workers exited. Exiting... (0)
Oct 10 10:14:45 compute-2 systemd[1]: libpod-0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9.scope: Deactivated successfully.
Oct 10 10:14:45 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 12.
Oct 10 10:14:45 compute-2 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:14:45 compute-2 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.298s CPU time.
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.057 2 INFO nova.virt.libvirt.driver [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Instance destroyed successfully.
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.058 2 DEBUG nova.objects.instance [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'resources' on Instance uuid f6ec6baf-a91e-4c7e-b1cf-b176d952068f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 10:14:45 compute-2 podman[241873]: 2025-10-10 10:14:45.060252725 +0000 UTC m=+0.055451636 container died 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 10 10:14:45 compute-2 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.071 2 DEBUG nova.virt.libvirt.vif [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-10T10:14:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1362038391',display_name='tempest-TestNetworkBasicOps-server-1362038391',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1362038391',id=5,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOaO/Dm5TZJdJA+p0WorpE1s/wHDKiboiIskSllf2vhdjUj1oz81caVPGQVtZrwI+VVMAczLEmtRNwhb15+QK4so2BghvGEI3ChmYsvOZuU3tzU+nN+IQyotPE2q48Vw5A==',key_name='tempest-TestNetworkBasicOps-804562104',keypairs=<?>,launch_index=0,launched_at=2025-10-10T10:14:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-ksfjfy6b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-10T10:14:26Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=f6ec6baf-a91e-4c7e-b1cf-b176d952068f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.071 2 DEBUG nova.network.os_vif_util [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.072 2 DEBUG nova.network.os_vif_util [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.072 2 DEBUG os_vif [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.074 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe812d6f-78, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.081 2 INFO os_vif [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78')
Oct 10 10:14:45 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9-userdata-shm.mount: Deactivated successfully.
Oct 10 10:14:45 compute-2 systemd[1]: var-lib-containers-storage-overlay-62fcae525be285a1d8adf5d06c7c663fa56b70679788d48c992ce41c622e09da-merged.mount: Deactivated successfully.
Oct 10 10:14:45 compute-2 podman[241873]: 2025-10-10 10:14:45.109682424 +0000 UTC m=+0.104881315 container cleanup 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:14:45 compute-2 systemd[1]: libpod-conmon-0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9.scope: Deactivated successfully.
Oct 10 10:14:45 compute-2 podman[241937]: 2025-10-10 10:14:45.181537625 +0000 UTC m=+0.046614700 container remove 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 10:14:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.189 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[b69bca29-3bc5-485e-be49-ff10788a70fe]: (4, ('Fri Oct 10 10:14:44 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6 (0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9)\n0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9\nFri Oct 10 10:14:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6 (0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9)\n0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.191 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[f1d2c09b-3238-4834-b48c-0ab3c002ca37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.191 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8850c4c-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:45 compute-2 kernel: tapc8850c4c-d0: left promiscuous mode
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.214 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[5b2b9461-327c-444c-89e0-b5a18a42039c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.242 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[07148111-9e27-4dfd-8eb4-19d999636c37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.243 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[4109441a-534a-4cf8-bc7a-2b6fa38dbf47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.259 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[783c3487-faeb-4e98-822e-3c2469bdaea9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417333, 'reachable_time': 39281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241982, 'error': None, 'target': 'ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:45 compute-2 systemd[1]: run-netns-ovnmeta\x2dc8850c4c\x2ddc38\x2d4440\x2d9c03\x2df2dd59684fe6.mount: Deactivated successfully.
Oct 10 10:14:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.274 141908 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 10 10:14:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.275 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[237005b9-421f-4741-b0b4-5492ab726e3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:14:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:45 compute-2 podman[241989]: 2025-10-10 10:14:45.366574337 +0000 UTC m=+0.065863319 container create eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 10:14:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb5c007710ca0236f76704862fb485066d7094fed5c8c0496d6985ebf3d17e39/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 10:14:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb5c007710ca0236f76704862fb485066d7094fed5c8c0496d6985ebf3d17e39/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 10:14:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb5c007710ca0236f76704862fb485066d7094fed5c8c0496d6985ebf3d17e39/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:14:45 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb5c007710ca0236f76704862fb485066d7094fed5c8c0496d6985ebf3d17e39/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 10:14:45 compute-2 podman[241989]: 2025-10-10 10:14:45.342291726 +0000 UTC m=+0.041580738 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 10:14:45 compute-2 podman[241989]: 2025-10-10 10:14:45.436961731 +0000 UTC m=+0.136250733 container init eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2)
Oct 10 10:14:45 compute-2 podman[241989]: 2025-10-10 10:14:45.444051349 +0000 UTC m=+0.143340331 container start eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 10:14:45 compute-2 bash[241989]: eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f
Oct 10 10:14:45 compute-2 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 10:14:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 10:14:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.500 2 INFO nova.virt.libvirt.driver [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Deleting instance files /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f_del
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.501 2 INFO nova.virt.libvirt.driver [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Deletion of /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f_del complete
Oct 10 10:14:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 10:14:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 10:14:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 10:14:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 10:14:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 10:14:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.574 2 DEBUG nova.virt.libvirt.host [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.574 2 INFO nova.virt.libvirt.host [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] UEFI support detected
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.576 2 INFO nova.compute.manager [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Took 0.76 seconds to destroy the instance on the hypervisor.
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.577 2 DEBUG oslo.service.loopingcall [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.577 2 DEBUG nova.compute.manager [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.577 2 DEBUG nova.network.neutron [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.619 2 DEBUG nova.compute.manager [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-unplugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.619 2 DEBUG oslo_concurrency.lockutils [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.619 2 DEBUG oslo_concurrency.lockutils [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.620 2 DEBUG oslo_concurrency.lockutils [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.620 2 DEBUG nova.compute.manager [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] No waiting events found dispatching network-vif-unplugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.620 2 DEBUG nova.compute.manager [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-unplugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 10 10:14:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:45 compute-2 nova_compute[235775]: 2025-10-10 10:14:45.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.776 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:14:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.778 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:14:46 compute-2 ceph-mon[74913]: pgmap v864: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:14:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:46.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:14:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:46.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:14:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.065011) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287065056, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2358, "num_deletes": 251, "total_data_size": 6164593, "memory_usage": 6261456, "flush_reason": "Manual Compaction"}
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287085962, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3994523, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26113, "largest_seqno": 28466, "table_properties": {"data_size": 3985243, "index_size": 5774, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19477, "raw_average_key_size": 20, "raw_value_size": 3966615, "raw_average_value_size": 4119, "num_data_blocks": 254, "num_entries": 963, "num_filter_entries": 963, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091080, "oldest_key_time": 1760091080, "file_creation_time": 1760091287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 21039 microseconds, and 8398 cpu microseconds.
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.086045) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3994523 bytes OK
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.086078) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.087636) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.087662) EVENT_LOG_v1 {"time_micros": 1760091287087654, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.087694) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6154232, prev total WAL file size 6154232, number of live WAL files 2.
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.089441) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3900KB)], [51(11MB)]
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287089496, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 16447771, "oldest_snapshot_seqno": -1}
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5813 keys, 14325919 bytes, temperature: kUnknown
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287167539, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 14325919, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14286658, "index_size": 23599, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 147737, "raw_average_key_size": 25, "raw_value_size": 14181276, "raw_average_value_size": 2439, "num_data_blocks": 964, "num_entries": 5813, "num_filter_entries": 5813, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.167820) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 14325919 bytes
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.169189) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 210.5 rd, 183.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 11.9 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 6331, records dropped: 518 output_compression: NoCompression
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.169215) EVENT_LOG_v1 {"time_micros": 1760091287169203, "job": 30, "event": "compaction_finished", "compaction_time_micros": 78119, "compaction_time_cpu_micros": 25940, "output_level": 6, "num_output_files": 1, "total_output_size": 14325919, "num_input_records": 6331, "num_output_records": 5813, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287170357, "job": 30, "event": "table_file_deletion", "file_number": 53}
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287173358, "job": 30, "event": "table_file_deletion", "file_number": 51}
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.089355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.173414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.173418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.173420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.173422) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:14:47 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.173424) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.587 2 DEBUG nova.network.neutron [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.623 2 INFO nova.compute.manager [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Took 2.05 seconds to deallocate network for instance.
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.695 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.695 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.739 2 DEBUG nova.compute.manager [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.739 2 DEBUG oslo_concurrency.lockutils [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.740 2 DEBUG oslo_concurrency.lockutils [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.740 2 DEBUG oslo_concurrency.lockutils [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.740 2 DEBUG nova.compute.manager [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] No waiting events found dispatching network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.740 2 WARNING nova.compute.manager [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received unexpected event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 for instance with vm_state deleted and task_state None.
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.740 2 DEBUG nova.compute.manager [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-deleted-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:14:47 compute-2 nova_compute[235775]: 2025-10-10 10:14:47.762 2 DEBUG oslo_concurrency.processutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:48 compute-2 ceph-mon[74913]: pgmap v865: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:14:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:48.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:14:48 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1817080526' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.194 2 DEBUG oslo_concurrency.processutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.200 2 DEBUG nova.compute.provider_tree [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.245 2 ERROR nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [req-469b8aff-522c-4b7a-a079-dfcb7da00766] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID dcdfa54c-9f95-46da-9af1-da3e28d81cf0.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-469b8aff-522c-4b7a-a079-dfcb7da00766"}]}
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.270 2 DEBUG nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Refreshing inventories for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.296 2 DEBUG nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating ProviderTree inventory for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.297 2 DEBUG nova.compute.provider_tree [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.314 2 DEBUG nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Refreshing aggregate associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.349 2 DEBUG nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Refreshing trait associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, traits: HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_SVM,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.397 2 DEBUG oslo_concurrency.processutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:14:48 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2226505712' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.822 2 DEBUG oslo_concurrency.processutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.826 2 DEBUG nova.compute.provider_tree [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 10:14:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:14:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:48.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.912 2 DEBUG nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updated inventory for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.913 2 DEBUG nova.compute.provider_tree [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.913 2 DEBUG nova.compute.provider_tree [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.940 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:48 compute-2 nova_compute[235775]: 2025-10-10 10:14:48.966 2 INFO nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Deleted allocations for instance f6ec6baf-a91e-4c7e-b1cf-b176d952068f
Oct 10 10:14:49 compute-2 nova_compute[235775]: 2025-10-10 10:14:49.035 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1817080526' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2226505712' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:50.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:50 compute-2 nova_compute[235775]: 2025-10-10 10:14:50.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:50 compute-2 ceph-mon[74913]: pgmap v866: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 345 KiB/s rd, 2.1 MiB/s wr, 93 op/s
Oct 10 10:14:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:50 compute-2 nova_compute[235775]: 2025-10-10 10:14:50.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:14:50 compute-2 nova_compute[235775]: 2025-10-10 10:14:50.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:14:50 compute-2 nova_compute[235775]: 2025-10-10 10:14:50.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:14:50 compute-2 nova_compute[235775]: 2025-10-10 10:14:50.816 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:14:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:50.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:50 compute-2 nova_compute[235775]: 2025-10-10 10:14:50.836 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:14:50 compute-2 nova_compute[235775]: 2025-10-10 10:14:50.836 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:14:50 compute-2 nova_compute[235775]: 2025-10-10 10:14:50.837 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:14:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:14:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:14:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:14:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:14:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:52.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:14:52 compute-2 ceph-mon[74913]: pgmap v867: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 167 KiB/s rd, 109 KiB/s wr, 57 op/s
Oct 10 10:14:52 compute-2 nova_compute[235775]: 2025-10-10 10:14:52.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:52 compute-2 nova_compute[235775]: 2025-10-10 10:14:52.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:14:52 compute-2 nova_compute[235775]: 2025-10-10 10:14:52.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:14:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:52.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:52 compute-2 nova_compute[235775]: 2025-10-10 10:14:52.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:52 compute-2 nova_compute[235775]: 2025-10-10 10:14:52.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:52 compute-2 nova_compute[235775]: 2025-10-10 10:14:52.844 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:52 compute-2 nova_compute[235775]: 2025-10-10 10:14:52.844 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:14:52 compute-2 nova_compute[235775]: 2025-10-10 10:14:52.844 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:14:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1407840113' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:53 compute-2 nova_compute[235775]: 2025-10-10 10:14:53.268 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:53 compute-2 nova_compute[235775]: 2025-10-10 10:14:53.406 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:14:53 compute-2 nova_compute[235775]: 2025-10-10 10:14:53.407 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4880MB free_disk=59.94269943237305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:14:53 compute-2 nova_compute[235775]: 2025-10-10 10:14:53.408 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:14:53 compute-2 nova_compute[235775]: 2025-10-10 10:14:53.408 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:14:53 compute-2 nova_compute[235775]: 2025-10-10 10:14:53.486 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:14:53 compute-2 nova_compute[235775]: 2025-10-10 10:14:53.486 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:14:53 compute-2 nova_compute[235775]: 2025-10-10 10:14:53.513 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:14:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:53 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:14:53.781 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:14:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:14:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1389124777' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:54 compute-2 nova_compute[235775]: 2025-10-10 10:14:54.013 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:14:54 compute-2 nova_compute[235775]: 2025-10-10 10:14:54.019 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:14:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:54.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:54 compute-2 ceph-mon[74913]: pgmap v868: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 168 KiB/s rd, 109 KiB/s wr, 58 op/s
Oct 10 10:14:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1407840113' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1389124777' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:54 compute-2 nova_compute[235775]: 2025-10-10 10:14:54.205 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:14:54 compute-2 nova_compute[235775]: 2025-10-10 10:14:54.239 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:14:54 compute-2 nova_compute[235775]: 2025-10-10 10:14:54.239 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.831s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:14:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:54.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:55 compute-2 nova_compute[235775]: 2025-10-10 10:14:55.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1987096323' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:55 compute-2 nova_compute[235775]: 2025-10-10 10:14:55.237 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:14:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:14:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:55 compute-2 nova_compute[235775]: 2025-10-10 10:14:55.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:14:55 compute-2 nova_compute[235775]: 2025-10-10 10:14:55.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:14:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:14:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:14:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:14:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:14:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:56.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:56 compute-2 ceph-mon[74913]: pgmap v869: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 14 KiB/s wr, 29 op/s
Oct 10 10:14:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/773403696' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/4254524373' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:56 compute-2 nova_compute[235775]: 2025-10-10 10:14:56.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:14:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:56.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:57 compute-2 nova_compute[235775]: 2025-10-10 10:14:57.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:14:57 compute-2 podman[242149]: 2025-10-10 10:14:57.780489031 +0000 UTC m=+0.059483914 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 10:14:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:57 compute-2 podman[242151]: 2025-10-10 10:14:57.785452741 +0000 UTC m=+0.056360944 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible)
Oct 10 10:14:57 compute-2 podman[242150]: 2025-10-10 10:14:57.805791215 +0000 UTC m=+0.081168922 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 10:14:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:14:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:58.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:14:58 compute-2 ceph-mon[74913]: pgmap v870: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 14 KiB/s wr, 29 op/s
Oct 10 10:14:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/688430293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:58 compute-2 sudo[242211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:14:58 compute-2 sudo[242211]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:14:58 compute-2 sudo[242211]: pam_unix(sudo:session): session closed for user root
Oct 10 10:14:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:14:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:14:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:58.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:14:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2213807489' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:14:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:14:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:00 compute-2 nova_compute[235775]: 2025-10-10 10:15:00.057 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1760091285.055142, f6ec6baf-a91e-4c7e-b1cf-b176d952068f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:15:00 compute-2 nova_compute[235775]: 2025-10-10 10:15:00.057 2 INFO nova.compute.manager [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] VM Stopped (Lifecycle Event)
Oct 10 10:15:00 compute-2 nova_compute[235775]: 2025-10-10 10:15:00.085 2 DEBUG nova.compute.manager [None req-5d03117d-d595-4d75-bcd1-0a18ab46fbd2 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:15:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:00 compute-2 nova_compute[235775]: 2025-10-10 10:15:00.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:15:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:00.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:15:00 compute-2 ceph-mon[74913]: pgmap v871: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 15 KiB/s wr, 58 op/s
Oct 10 10:15:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:00.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:02.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:02 compute-2 ceph-mon[74913]: pgmap v872: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 29 op/s
Oct 10 10:15:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:15:02 compute-2 nova_compute[235775]: 2025-10-10 10:15:02.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:15:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:02.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:15:02 compute-2 nova_compute[235775]: 2025-10-10 10:15:02.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:03 compute-2 nova_compute[235775]: 2025-10-10 10:15:03.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:04.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:04 compute-2 ceph-mon[74913]: pgmap v873: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 1.2 KiB/s wr, 29 op/s
Oct 10 10:15:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:04.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:05 compute-2 nova_compute[235775]: 2025-10-10 10:15:05.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:06.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:06 compute-2 ceph-mon[74913]: pgmap v874: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Oct 10 10:15:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:15:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:06.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:15:07 compute-2 nova_compute[235775]: 2025-10-10 10:15:07.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 10:15:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:08.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 10:15:08 compute-2 ceph-mon[74913]: pgmap v875: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Oct 10 10:15:08 compute-2 podman[242247]: 2025-10-10 10:15:08.776769198 +0000 UTC m=+0.056973782 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 10 10:15:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:08.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:10 compute-2 nova_compute[235775]: 2025-10-10 10:15:10.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:10.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:10 compute-2 ceph-mon[74913]: pgmap v876: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 29 op/s
Oct 10 10:15:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:10.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:11 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:12.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:12 compute-2 ceph-mon[74913]: pgmap v877: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:15:12 compute-2 nova_compute[235775]: 2025-10-10 10:15:12.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:12.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:14.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:14 compute-2 ceph-mon[74913]: pgmap v878: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:15:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:14.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:15 compute-2 nova_compute[235775]: 2025-10-10 10:15:15.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:15 compute-2 ceph-mon[74913]: pgmap v879: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:15:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:16 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:16.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:15:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:15:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:16.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:15:17 compute-2 ceph-mon[74913]: pgmap v880: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:15:17 compute-2 nova_compute[235775]: 2025-10-10 10:15:17.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:18.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:18 compute-2 sudo[242276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:15:18 compute-2 sudo[242276]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:15:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:18 compute-2 sudo[242276]: pam_unix(sudo:session): session closed for user root
Oct 10 10:15:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:18.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:15:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 5489 writes, 28K keys, 5489 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s
                                           Cumulative WAL: 5489 writes, 5489 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1548 writes, 7390 keys, 1548 commit groups, 1.0 writes per commit group, ingest: 16.91 MB, 0.03 MB/s
                                           Interval WAL: 1548 writes, 1548 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    134.5      0.32              0.12        15    0.021       0      0       0.0       0.0
                                             L6      1/0   13.66 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.1    175.0    150.0      1.18              0.46        14    0.084     73K   7381       0.0       0.0
                                            Sum      1/0   13.66 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   5.1    137.5    146.7      1.50              0.58        29    0.052     73K   7381       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.9    158.0    161.0      0.47              0.22        10    0.047     30K   2558       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    175.0    150.0      1.18              0.46        14    0.084     73K   7381       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    135.3      0.32              0.12        14    0.023       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.042, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.21 GB write, 0.12 MB/s write, 0.20 GB read, 0.11 MB/s read, 1.5 seconds
                                           Interval compaction: 0.07 GB write, 0.13 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x56161a963350#2 capacity: 304.00 MB usage: 17.45 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000153 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(931,16.87 MB,5.5481%) FilterBlock(29,219.23 KB,0.0704263%) IndexBlock(29,378.61 KB,0.121624%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Oct 10 10:15:20 compute-2 ceph-mon[74913]: pgmap v881: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:15:20 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3825464269' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:15:20 compute-2 nova_compute[235775]: 2025-10-10 10:15:20.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:20.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:20.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:21 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:22 compute-2 ceph-mon[74913]: pgmap v882: 353 pgs: 353 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:15:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:22.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:22 compute-2 nova_compute[235775]: 2025-10-10 10:15:22.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:15:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:22.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:15:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:24 compute-2 ceph-mon[74913]: pgmap v883: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:15:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:24.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:15:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:24.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:15:25 compute-2 nova_compute[235775]: 2025-10-10 10:15:25.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:26 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:26 compute-2 ceph-mon[74913]: pgmap v884: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:15:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3004183643' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:15:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3003324697' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:15:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:26.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:26.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3262215243' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:15:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3262215243' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:15:27 compute-2 nova_compute[235775]: 2025-10-10 10:15:27.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:28 compute-2 ceph-mon[74913]: pgmap v885: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:15:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:28.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:28 compute-2 podman[242313]: 2025-10-10 10:15:28.8035987 +0000 UTC m=+0.068123213 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 10:15:28 compute-2 podman[242312]: 2025-10-10 10:15:28.805964515 +0000 UTC m=+0.074795016 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:15:28 compute-2 podman[242311]: 2025-10-10 10:15:28.826706252 +0000 UTC m=+0.090018495 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 10:15:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:28.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:30 compute-2 nova_compute[235775]: 2025-10-10 10:15:30.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:30 compute-2 ceph-mon[74913]: pgmap v886: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 1.8 MiB/s wr, 33 op/s
Oct 10 10:15:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:30.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:30.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:32.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:32 compute-2 ceph-mon[74913]: pgmap v887: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 1.8 MiB/s wr, 33 op/s
Oct 10 10:15:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:15:32 compute-2 nova_compute[235775]: 2025-10-10 10:15:32.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:15:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:32.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:15:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:15:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:34.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:15:34 compute-2 ceph-mon[74913]: pgmap v888: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Oct 10 10:15:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:34.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:35 compute-2 nova_compute[235775]: 2025-10-10 10:15:35.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Oct 10 10:15:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:36.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:36 compute-2 ceph-mon[74913]: pgmap v889: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Oct 10 10:15:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:36.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:37 compute-2 nova_compute[235775]: 2025-10-10 10:15:37.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:38.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:38 compute-2 ceph-mon[74913]: pgmap v890: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Oct 10 10:15:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:38.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:38 compute-2 sudo[242388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:15:38 compute-2 sudo[242388]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:15:38 compute-2 sudo[242388]: pam_unix(sudo:session): session closed for user root
Oct 10 10:15:38 compute-2 podman[242413]: 2025-10-10 10:15:38.979043715 +0000 UTC m=+0.064802315 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 10:15:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:39 compute-2 ovn_controller[132503]: 2025-10-10T10:15:39Z|00037|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Oct 10 10:15:40 compute-2 nova_compute[235775]: 2025-10-10 10:15:40.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:40.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:40 compute-2 ceph-mon[74913]: pgmap v891: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 75 op/s
Oct 10 10:15:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:40.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:41 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:41 compute-2 sudo[242435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:15:41 compute-2 sudo[242435]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:15:41 compute-2 sudo[242435]: pam_unix(sudo:session): session closed for user root
Oct 10 10:15:41 compute-2 sudo[242460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Oct 10 10:15:41 compute-2 sudo[242460]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:15:41 compute-2 sudo[242460]: pam_unix(sudo:session): session closed for user root
Oct 10 10:15:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:15:41.469 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:15:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:15:41.469 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:15:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:15:41.469 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:15:41 compute-2 sudo[242505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:15:41 compute-2 sudo[242505]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:15:41 compute-2 sudo[242505]: pam_unix(sudo:session): session closed for user root
Oct 10 10:15:41 compute-2 sudo[242530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:15:41 compute-2 sudo[242530]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:15:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:42 compute-2 sudo[242530]: pam_unix(sudo:session): session closed for user root
Oct 10 10:15:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:42.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:42 compute-2 ceph-mon[74913]: pgmap v892: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 69 op/s
Oct 10 10:15:42 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:15:42 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:15:42 compute-2 nova_compute[235775]: 2025-10-10 10:15:42.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:42.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:44.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:44 compute-2 ceph-mon[74913]: pgmap v893: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 132 op/s
Oct 10 10:15:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:15:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:15:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:15:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:15:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:15:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:15:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:15:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:15:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:15:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:15:44 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:15:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:44.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:45 compute-2 nova_compute[235775]: 2025-10-10 10:15:45.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:46.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:46 compute-2 ceph-mon[74913]: pgmap v894: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 306 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Oct 10 10:15:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:46.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:15:47 compute-2 nova_compute[235775]: 2025-10-10 10:15:47.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:48.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:48 compute-2 ceph-mon[74913]: pgmap v895: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 306 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Oct 10 10:15:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:48 compute-2 nova_compute[235775]: 2025-10-10 10:15:48.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:48 compute-2 nova_compute[235775]: 2025-10-10 10:15:48.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 10 10:15:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:48.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:49 compute-2 sudo[242596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:15:49 compute-2 sudo[242596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:15:49 compute-2 sudo[242596]: pam_unix(sudo:session): session closed for user root
Oct 10 10:15:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:15:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 8276 writes, 33K keys, 8276 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 8276 writes, 2019 syncs, 4.10 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2282 writes, 8748 keys, 2282 commit groups, 1.0 writes per commit group, ingest: 10.36 MB, 0.02 MB/s
                                           Interval WAL: 2282 writes, 922 syncs, 2.48 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct 10 10:15:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:15:49 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:15:49 compute-2 ceph-mon[74913]: pgmap v896: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 307 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:15:50 compute-2 nova_compute[235775]: 2025-10-10 10:15:50.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:50.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:50 compute-2 nova_compute[235775]: 2025-10-10 10:15:50.832 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:50.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:51 compute-2 nova_compute[235775]: 2025-10-10 10:15:51.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:52 compute-2 ceph-mon[74913]: pgmap v897: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 306 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Oct 10 10:15:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:52.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:15:52.482 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:15:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:15:52.482 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:15:52 compute-2 nova_compute[235775]: 2025-10-10 10:15:52.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:52 compute-2 nova_compute[235775]: 2025-10-10 10:15:52.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:52 compute-2 nova_compute[235775]: 2025-10-10 10:15:52.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:52 compute-2 nova_compute[235775]: 2025-10-10 10:15:52.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:15:52 compute-2 nova_compute[235775]: 2025-10-10 10:15:52.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:15:52 compute-2 nova_compute[235775]: 2025-10-10 10:15:52.827 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:15:52 compute-2 nova_compute[235775]: 2025-10-10 10:15:52.828 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:52 compute-2 nova_compute[235775]: 2025-10-10 10:15:52.828 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:52 compute-2 nova_compute[235775]: 2025-10-10 10:15:52.828 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 10 10:15:52 compute-2 nova_compute[235775]: 2025-10-10 10:15:52.846 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 10 10:15:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:52.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:53 compute-2 nova_compute[235775]: 2025-10-10 10:15:53.833 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:53 compute-2 nova_compute[235775]: 2025-10-10 10:15:53.859 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:15:53 compute-2 nova_compute[235775]: 2025-10-10 10:15:53.859 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:15:53 compute-2 nova_compute[235775]: 2025-10-10 10:15:53.859 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:15:53 compute-2 nova_compute[235775]: 2025-10-10 10:15:53.859 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:15:53 compute-2 nova_compute[235775]: 2025-10-10 10:15:53.860 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:15:54 compute-2 ceph-mon[74913]: pgmap v898: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 307 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:15:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:54.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:15:54 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2472353382' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:15:54 compute-2 nova_compute[235775]: 2025-10-10 10:15:54.340 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:15:54 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:15:54.484 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:15:54 compute-2 nova_compute[235775]: 2025-10-10 10:15:54.518 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:15:54 compute-2 nova_compute[235775]: 2025-10-10 10:15:54.519 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4913MB free_disk=59.9427490234375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:15:54 compute-2 nova_compute[235775]: 2025-10-10 10:15:54.519 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:15:54 compute-2 nova_compute[235775]: 2025-10-10 10:15:54.520 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:15:54 compute-2 nova_compute[235775]: 2025-10-10 10:15:54.675 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:15:54 compute-2 nova_compute[235775]: 2025-10-10 10:15:54.676 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:15:54 compute-2 nova_compute[235775]: 2025-10-10 10:15:54.730 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:15:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:54.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2472353382' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:15:55 compute-2 nova_compute[235775]: 2025-10-10 10:15:55.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:15:55 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/375310620' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:15:55 compute-2 nova_compute[235775]: 2025-10-10 10:15:55.166 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:15:55 compute-2 nova_compute[235775]: 2025-10-10 10:15:55.171 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:15:55 compute-2 nova_compute[235775]: 2025-10-10 10:15:55.191 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:15:55 compute-2 nova_compute[235775]: 2025-10-10 10:15:55.192 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:15:55 compute-2 nova_compute[235775]: 2025-10-10 10:15:55.192 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:15:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:15:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:15:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:15:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:15:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:15:56 compute-2 ceph-mon[74913]: pgmap v899: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 12 KiB/s wr, 1 op/s
Oct 10 10:15:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/375310620' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:15:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3100558740' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:15:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:56.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:56 compute-2 nova_compute[235775]: 2025-10-10 10:15:56.174 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:56 compute-2 nova_compute[235775]: 2025-10-10 10:15:56.191 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:56 compute-2 nova_compute[235775]: 2025-10-10 10:15:56.192 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:56 compute-2 nova_compute[235775]: 2025-10-10 10:15:56.192 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:56 compute-2 nova_compute[235775]: 2025-10-10 10:15:56.192 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:15:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:56 compute-2 nova_compute[235775]: 2025-10-10 10:15:56.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:56.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/422541512' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:15:57 compute-2 nova_compute[235775]: 2025-10-10 10:15:57.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:15:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:57 compute-2 nova_compute[235775]: 2025-10-10 10:15:57.829 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:15:58 compute-2 ceph-mon[74913]: pgmap v900: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 12 KiB/s wr, 1 op/s
Oct 10 10:15:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:15:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:58.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:15:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:15:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:15:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:58.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:15:59 compute-2 sudo[242675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:15:59 compute-2 sudo[242675]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:15:59 compute-2 sudo[242675]: pam_unix(sudo:session): session closed for user root
Oct 10 10:15:59 compute-2 podman[242701]: 2025-10-10 10:15:59.063254132 +0000 UTC m=+0.048131839 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 10 10:15:59 compute-2 podman[242699]: 2025-10-10 10:15:59.063237252 +0000 UTC m=+0.053282515 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct 10 10:15:59 compute-2 podman[242700]: 2025-10-10 10:15:59.087621446 +0000 UTC m=+0.075379666 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller)
Oct 10 10:15:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:15:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:00 compute-2 nova_compute[235775]: 2025-10-10 10:16:00.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:00 compute-2 ceph-mon[74913]: pgmap v901: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 14 KiB/s wr, 2 op/s
Oct 10 10:16:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:00.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:00.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:02 compute-2 ceph-mon[74913]: pgmap v902: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 2.0 KiB/s wr, 1 op/s
Oct 10 10:16:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:16:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/924314656' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:16:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:02.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:02 compute-2 nova_compute[235775]: 2025-10-10 10:16:02.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:16:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:02.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:16:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/369795556' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:16:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:16:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:04.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:16:04 compute-2 ceph-mon[74913]: pgmap v903: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 7.3 KiB/s wr, 2 op/s
Oct 10 10:16:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:04.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:05 compute-2 nova_compute[235775]: 2025-10-10 10:16:05.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:06.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:06 compute-2 ceph-mon[74913]: pgmap v904: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 7.3 KiB/s wr, 1 op/s
Oct 10 10:16:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1563002844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:16:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:06.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:07 compute-2 nova_compute[235775]: 2025-10-10 10:16:07.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:16:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:08.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:16:08 compute-2 ceph-mon[74913]: pgmap v905: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 7.3 KiB/s wr, 1 op/s
Oct 10 10:16:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:08.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:09 compute-2 podman[242773]: 2025-10-10 10:16:09.815246882 +0000 UTC m=+0.082780313 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 10:16:10 compute-2 nova_compute[235775]: 2025-10-10 10:16:10.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:10.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:10 compute-2 ceph-mon[74913]: pgmap v906: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Oct 10 10:16:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1304454409' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:16:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2889548273' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:16:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:16:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:10.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:16:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:11 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:12.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:12 compute-2 ceph-mon[74913]: pgmap v907: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:16:12 compute-2 nova_compute[235775]: 2025-10-10 10:16:12.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:12.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:14.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:14 compute-2 ceph-mon[74913]: pgmap v908: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 1.8 MiB/s wr, 39 op/s
Oct 10 10:16:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:14.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:15 compute-2 nova_compute[235775]: 2025-10-10 10:16:15.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:16 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:16.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:16 compute-2 ceph-mon[74913]: pgmap v909: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Oct 10 10:16:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:16.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:16:17 compute-2 nova_compute[235775]: 2025-10-10 10:16:17.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:18.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:18 compute-2 ceph-mon[74913]: pgmap v910: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Oct 10 10:16:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:18.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:19 compute-2 sudo[242803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:16:19 compute-2 sudo[242803]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:16:19 compute-2 sudo[242803]: pam_unix(sudo:session): session closed for user root
Oct 10 10:16:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:20 compute-2 nova_compute[235775]: 2025-10-10 10:16:20.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:16:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:20.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:16:20 compute-2 ceph-mon[74913]: pgmap v911: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 103 op/s
Oct 10 10:16:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:16:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:20.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:16:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:21 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:22.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:22 compute-2 ceph-mon[74913]: pgmap v912: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 17 KiB/s wr, 75 op/s
Oct 10 10:16:22 compute-2 nova_compute[235775]: 2025-10-10 10:16:22.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:22.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:23 compute-2 ceph-mon[74913]: pgmap v913: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 19 KiB/s wr, 77 op/s
Oct 10 10:16:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:24.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:24 compute-2 nova_compute[235775]: 2025-10-10 10:16:24.903 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:16:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:24.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:25 compute-2 nova_compute[235775]: 2025-10-10 10:16:25.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:25 compute-2 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 10 10:16:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:26 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:26 compute-2 ceph-mon[74913]: pgmap v914: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 3.4 KiB/s wr, 66 op/s
Oct 10 10:16:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:26.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:26.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1213483499' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:16:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1213483499' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:16:27 compute-2 nova_compute[235775]: 2025-10-10 10:16:27.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:28 compute-2 ceph-mon[74913]: pgmap v915: 353 pgs: 353 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 3.4 KiB/s wr, 66 op/s
Oct 10 10:16:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:28.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:28.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:29 compute-2 podman[242843]: 2025-10-10 10:16:29.814878687 +0000 UTC m=+0.087781344 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 10:16:29 compute-2 podman[242844]: 2025-10-10 10:16:29.820780557 +0000 UTC m=+0.091114952 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct 10 10:16:29 compute-2 podman[242842]: 2025-10-10 10:16:29.822662397 +0000 UTC m=+0.089900892 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 10:16:30 compute-2 ceph-mon[74913]: pgmap v916: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 127 op/s
Oct 10 10:16:30 compute-2 nova_compute[235775]: 2025-10-10 10:16:30.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:30.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:30.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:32 compute-2 ceph-mon[74913]: pgmap v917: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 288 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Oct 10 10:16:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:16:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:16:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:32.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:16:32 compute-2 nova_compute[235775]: 2025-10-10 10:16:32.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:32.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:34 compute-2 ceph-mon[74913]: pgmap v918: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 288 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Oct 10 10:16:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:34.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:34.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:35 compute-2 nova_compute[235775]: 2025-10-10 10:16:35.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:36 compute-2 ceph-mon[74913]: pgmap v919: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 284 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Oct 10 10:16:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:36.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:36.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:37 compute-2 nova_compute[235775]: 2025-10-10 10:16:37.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:38 compute-2 ceph-mon[74913]: pgmap v920: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 284 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Oct 10 10:16:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:38.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:38.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:39 compute-2 sudo[242917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:16:39 compute-2 sudo[242917]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:16:39 compute-2 sudo[242917]: pam_unix(sudo:session): session closed for user root
Oct 10 10:16:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:40 compute-2 nova_compute[235775]: 2025-10-10 10:16:40.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:40 compute-2 ceph-mon[74913]: pgmap v921: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 285 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Oct 10 10:16:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:40.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:40 compute-2 podman[242943]: 2025-10-10 10:16:40.764097471 +0000 UTC m=+0.043709988 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 10 10:16:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:40.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:41 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:16:41.470 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:16:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:16:41.470 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:16:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:16:41.470 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:16:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:42 compute-2 ceph-mon[74913]: pgmap v922: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 15 KiB/s wr, 1 op/s
Oct 10 10:16:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:42.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:42 compute-2 nova_compute[235775]: 2025-10-10 10:16:42.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:42.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:44 compute-2 ceph-mon[74913]: pgmap v923: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 25 KiB/s wr, 3 op/s
Oct 10 10:16:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:44.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:44.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:45 compute-2 nova_compute[235775]: 2025-10-10 10:16:45.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:46 compute-2 ceph-mon[74913]: pgmap v924: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 12 KiB/s wr, 2 op/s
Oct 10 10:16:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:46.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:46.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:16:47 compute-2 nova_compute[235775]: 2025-10-10 10:16:47.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:48.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:48 compute-2 ceph-mon[74913]: pgmap v925: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 12 KiB/s wr, 2 op/s
Oct 10 10:16:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:48.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:49 compute-2 sudo[242972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:16:49 compute-2 sudo[242972]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:16:49 compute-2 sudo[242972]: pam_unix(sudo:session): session closed for user root
Oct 10 10:16:49 compute-2 sudo[242997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:16:49 compute-2 sudo[242997]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:16:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:49 compute-2 sudo[242997]: pam_unix(sudo:session): session closed for user root
Oct 10 10:16:50 compute-2 sudo[243054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:16:50 compute-2 sudo[243054]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:16:50 compute-2 sudo[243054]: pam_unix(sudo:session): session closed for user root
Oct 10 10:16:50 compute-2 sudo[243079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4 -- inventory --format=json-pretty --filter-for-batch
Oct 10 10:16:50 compute-2 sudo[243079]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:16:50 compute-2 nova_compute[235775]: 2025-10-10 10:16:50.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:16:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:50.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:16:50 compute-2 ceph-mon[74913]: pgmap v926: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 15 KiB/s wr, 3 op/s
Oct 10 10:16:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:50 compute-2 podman[243144]: 2025-10-10 10:16:50.545372971 +0000 UTC m=+0.051069324 container create 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2)
Oct 10 10:16:50 compute-2 systemd[1]: Started libpod-conmon-318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33.scope.
Oct 10 10:16:50 compute-2 podman[243144]: 2025-10-10 10:16:50.521348158 +0000 UTC m=+0.027044541 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 10:16:50 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:16:50 compute-2 podman[243144]: 2025-10-10 10:16:50.644960004 +0000 UTC m=+0.150656367 container init 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 10:16:50 compute-2 podman[243144]: 2025-10-10 10:16:50.654352586 +0000 UTC m=+0.160048969 container start 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Oct 10 10:16:50 compute-2 podman[243144]: 2025-10-10 10:16:50.658702306 +0000 UTC m=+0.164398689 container attach 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True)
Oct 10 10:16:50 compute-2 systemd[1]: libpod-318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33.scope: Deactivated successfully.
Oct 10 10:16:50 compute-2 strange_bell[243162]: 167 167
Oct 10 10:16:50 compute-2 conmon[243162]: conmon 318485f3a8d84ac65abf <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33.scope/container/memory.events
Oct 10 10:16:50 compute-2 podman[243144]: 2025-10-10 10:16:50.661901039 +0000 UTC m=+0.167597392 container died 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 10:16:50 compute-2 systemd[1]: var-lib-containers-storage-overlay-713019a75dc8b0996e1ae5efb45190fbb56ebf3b4a978f04bdc48e490de56cc9-merged.mount: Deactivated successfully.
Oct 10 10:16:50 compute-2 podman[243144]: 2025-10-10 10:16:50.698793295 +0000 UTC m=+0.204489678 container remove 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Oct 10 10:16:50 compute-2 systemd[1]: libpod-conmon-318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33.scope: Deactivated successfully.
Oct 10 10:16:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:50 compute-2 nova_compute[235775]: 2025-10-10 10:16:50.831 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:16:50 compute-2 podman[243185]: 2025-10-10 10:16:50.845008208 +0000 UTC m=+0.037745185 container create e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True)
Oct 10 10:16:50 compute-2 systemd[1]: Started libpod-conmon-e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009.scope.
Oct 10 10:16:50 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:16:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c46d437726f5d70ba260116ae7b0fa0ec696361fa9ee7992283f204eb9a249/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 10:16:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c46d437726f5d70ba260116ae7b0fa0ec696361fa9ee7992283f204eb9a249/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 10:16:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c46d437726f5d70ba260116ae7b0fa0ec696361fa9ee7992283f204eb9a249/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 10:16:50 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c46d437726f5d70ba260116ae7b0fa0ec696361fa9ee7992283f204eb9a249/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 10:16:50 compute-2 podman[243185]: 2025-10-10 10:16:50.925060673 +0000 UTC m=+0.117797670 container init e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Oct 10 10:16:50 compute-2 podman[243185]: 2025-10-10 10:16:50.829271682 +0000 UTC m=+0.022008669 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 10:16:50 compute-2 podman[243185]: 2025-10-10 10:16:50.932815352 +0000 UTC m=+0.125552339 container start e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 10:16:50 compute-2 podman[243185]: 2025-10-10 10:16:50.936440049 +0000 UTC m=+0.129177046 container attach e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 10:16:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:50.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:51 compute-2 sshd-session[242834]: error: kex_exchange_identification: read: Connection reset by peer
Oct 10 10:16:51 compute-2 sshd-session[242834]: Connection reset by 45.140.17.97 port 25294
Oct 10 10:16:51 compute-2 pensive_brown[243202]: [
Oct 10 10:16:51 compute-2 pensive_brown[243202]:     {
Oct 10 10:16:51 compute-2 pensive_brown[243202]:         "available": false,
Oct 10 10:16:51 compute-2 pensive_brown[243202]:         "being_replaced": false,
Oct 10 10:16:51 compute-2 pensive_brown[243202]:         "ceph_device_lvm": false,
Oct 10 10:16:51 compute-2 pensive_brown[243202]:         "device_id": "QEMU_DVD-ROM_QM00001",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:         "lsm_data": {},
Oct 10 10:16:51 compute-2 pensive_brown[243202]:         "lvs": [],
Oct 10 10:16:51 compute-2 pensive_brown[243202]:         "path": "/dev/sr0",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:         "rejected_reasons": [
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "Insufficient space (<5GB)",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "Has a FileSystem"
Oct 10 10:16:51 compute-2 pensive_brown[243202]:         ],
Oct 10 10:16:51 compute-2 pensive_brown[243202]:         "sys_api": {
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "actuators": null,
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "device_nodes": [
Oct 10 10:16:51 compute-2 pensive_brown[243202]:                 "sr0"
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             ],
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "devname": "sr0",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "human_readable_size": "482.00 KB",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "id_bus": "ata",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "model": "QEMU DVD-ROM",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "nr_requests": "2",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "parent": "/dev/sr0",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "partitions": {},
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "path": "/dev/sr0",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "removable": "1",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "rev": "2.5+",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "ro": "0",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "rotational": "0",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "sas_address": "",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "sas_device_handle": "",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "scheduler_mode": "mq-deadline",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "sectors": 0,
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "sectorsize": "2048",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "size": 493568.0,
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "support_discard": "2048",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "type": "disk",
Oct 10 10:16:51 compute-2 pensive_brown[243202]:             "vendor": "QEMU"
Oct 10 10:16:51 compute-2 pensive_brown[243202]:         }
Oct 10 10:16:51 compute-2 pensive_brown[243202]:     }
Oct 10 10:16:51 compute-2 pensive_brown[243202]: ]
Oct 10 10:16:51 compute-2 systemd[1]: libpod-e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009.scope: Deactivated successfully.
Oct 10 10:16:51 compute-2 conmon[243202]: conmon e710ce9fc70a0fc4e419 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009.scope/container/memory.events
Oct 10 10:16:51 compute-2 podman[243185]: 2025-10-10 10:16:51.698137668 +0000 UTC m=+0.890874675 container died e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid)
Oct 10 10:16:51 compute-2 systemd[1]: var-lib-containers-storage-overlay-43c46d437726f5d70ba260116ae7b0fa0ec696361fa9ee7992283f204eb9a249-merged.mount: Deactivated successfully.
Oct 10 10:16:51 compute-2 podman[243185]: 2025-10-10 10:16:51.738725363 +0000 UTC m=+0.931462340 container remove e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Oct 10 10:16:51 compute-2 systemd[1]: libpod-conmon-e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009.scope: Deactivated successfully.
Oct 10 10:16:51 compute-2 sudo[243079]: pam_unix(sudo:session): session closed for user root
Oct 10 10:16:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:52.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:52 compute-2 ceph-mon[74913]: pgmap v927: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 13 KiB/s wr, 2 op/s
Oct 10 10:16:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:16:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:16:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:16:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:16:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:16:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:16:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:16:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:16:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:16:52 compute-2 nova_compute[235775]: 2025-10-10 10:16:52.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:52 compute-2 nova_compute[235775]: 2025-10-10 10:16:52.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:16:52 compute-2 nova_compute[235775]: 2025-10-10 10:16:52.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:16:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:52.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:53 compute-2 nova_compute[235775]: 2025-10-10 10:16:53.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:16:53 compute-2 nova_compute[235775]: 2025-10-10 10:16:53.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:16:53 compute-2 nova_compute[235775]: 2025-10-10 10:16:53.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:16:53 compute-2 nova_compute[235775]: 2025-10-10 10:16:53.830 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:16:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:54.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:54 compute-2 ceph-mon[74913]: pgmap v928: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 14 KiB/s wr, 3 op/s
Oct 10 10:16:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:54 compute-2 nova_compute[235775]: 2025-10-10 10:16:54.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:16:54 compute-2 nova_compute[235775]: 2025-10-10 10:16:54.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:16:54 compute-2 nova_compute[235775]: 2025-10-10 10:16:54.836 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:16:54 compute-2 nova_compute[235775]: 2025-10-10 10:16:54.837 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:16:54 compute-2 nova_compute[235775]: 2025-10-10 10:16:54.837 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:16:54 compute-2 nova_compute[235775]: 2025-10-10 10:16:54.838 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:16:54 compute-2 nova_compute[235775]: 2025-10-10 10:16:54.838 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:16:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:54.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:55 compute-2 nova_compute[235775]: 2025-10-10 10:16:55.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:16:55 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3908480547' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:16:55 compute-2 nova_compute[235775]: 2025-10-10 10:16:55.285 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:16:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3908480547' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:16:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:16:55 compute-2 nova_compute[235775]: 2025-10-10 10:16:55.467 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:16:55 compute-2 nova_compute[235775]: 2025-10-10 10:16:55.468 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4861MB free_disk=59.89700698852539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:16:55 compute-2 nova_compute[235775]: 2025-10-10 10:16:55.468 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:16:55 compute-2 nova_compute[235775]: 2025-10-10 10:16:55.469 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:16:55 compute-2 nova_compute[235775]: 2025-10-10 10:16:55.543 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:16:55 compute-2 nova_compute[235775]: 2025-10-10 10:16:55.544 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:16:55 compute-2 nova_compute[235775]: 2025-10-10 10:16:55.595 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:16:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:16:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:16:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:16:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:16:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:16:56 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2955662946' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:16:56 compute-2 nova_compute[235775]: 2025-10-10 10:16:56.024 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:16:56 compute-2 nova_compute[235775]: 2025-10-10 10:16:56.029 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:16:56 compute-2 nova_compute[235775]: 2025-10-10 10:16:56.048 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:16:56 compute-2 nova_compute[235775]: 2025-10-10 10:16:56.050 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:16:56 compute-2 nova_compute[235775]: 2025-10-10 10:16:56.050 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:16:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:56.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:56 compute-2 ceph-mon[74913]: pgmap v929: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 4.3 KiB/s wr, 1 op/s
Oct 10 10:16:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2955662946' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:16:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:56.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:57 compute-2 sudo[244440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:16:57 compute-2 sudo[244440]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:16:57 compute-2 sudo[244440]: pam_unix(sudo:session): session closed for user root
Oct 10 10:16:57 compute-2 nova_compute[235775]: 2025-10-10 10:16:57.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:16:57 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:16:57 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:16:57 compute-2 ceph-mon[74913]: pgmap v930: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 4.3 KiB/s wr, 1 op/s
Oct 10 10:16:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:58 compute-2 nova_compute[235775]: 2025-10-10 10:16:58.051 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:16:58 compute-2 nova_compute[235775]: 2025-10-10 10:16:58.052 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:16:58 compute-2 nova_compute[235775]: 2025-10-10 10:16:58.052 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:16:58 compute-2 nova_compute[235775]: 2025-10-10 10:16:58.052 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:16:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:16:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:58.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:16:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2386263607' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:16:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:16:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:16:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:58.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:16:59 compute-2 sudo[244467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:16:59 compute-2 sudo[244467]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:16:59 compute-2 sudo[244467]: pam_unix(sudo:session): session closed for user root
Oct 10 10:16:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:16:59 compute-2 ceph-mon[74913]: pgmap v931: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 7.7 KiB/s wr, 2 op/s
Oct 10 10:16:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/854478764' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:17:00 compute-2 nova_compute[235775]: 2025-10-10 10:17:00.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:00.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:00 compute-2 podman[244493]: 2025-10-10 10:17:00.801010638 +0000 UTC m=+0.065507417 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:17:00 compute-2 podman[244495]: 2025-10-10 10:17:00.802100944 +0000 UTC m=+0.062138460 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_managed=true)
Oct 10 10:17:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:00 compute-2 podman[244494]: 2025-10-10 10:17:00.843849626 +0000 UTC m=+0.108359155 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 10:17:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:00.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:02 compute-2 ceph-mon[74913]: pgmap v932: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 4.3 KiB/s wr, 1 op/s
Oct 10 10:17:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:17:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/867703427' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:17:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:02.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:02 compute-2 nova_compute[235775]: 2025-10-10 10:17:02.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:02.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1641573569' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:17:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:04 compute-2 ceph-mon[74913]: pgmap v933: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 4.3 KiB/s wr, 2 op/s
Oct 10 10:17:04 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Oct 10 10:17:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:17:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:04.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.392683) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424392735, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1683, "num_deletes": 257, "total_data_size": 4233461, "memory_usage": 4297424, "flush_reason": "Manual Compaction"}
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424410888, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2743362, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28472, "largest_seqno": 30149, "table_properties": {"data_size": 2736402, "index_size": 3967, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 14641, "raw_average_key_size": 19, "raw_value_size": 2722384, "raw_average_value_size": 3634, "num_data_blocks": 174, "num_entries": 749, "num_filter_entries": 749, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091287, "oldest_key_time": 1760091287, "file_creation_time": 1760091424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 18270 microseconds, and 10013 cpu microseconds.
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.410949) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2743362 bytes OK
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.410978) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.412216) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.412238) EVENT_LOG_v1 {"time_micros": 1760091424412231, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.412260) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 4225731, prev total WAL file size 4225731, number of live WAL files 2.
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.414168) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353032' seq:72057594037927935, type:22 .. '6C6F676D00373535' seq:0, type:0; will stop at (end)
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2679KB)], [54(13MB)]
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424414231, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 17069281, "oldest_snapshot_seqno": -1}
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 6030 keys, 16925177 bytes, temperature: kUnknown
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424505979, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 16925177, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16881925, "index_size": 27078, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15109, "raw_key_size": 153413, "raw_average_key_size": 25, "raw_value_size": 16770315, "raw_average_value_size": 2781, "num_data_blocks": 1111, "num_entries": 6030, "num_filter_entries": 6030, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.506376) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 16925177 bytes
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.508474) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 185.7 rd, 184.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 13.7 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(12.4) write-amplify(6.2) OK, records in: 6562, records dropped: 532 output_compression: NoCompression
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.508489) EVENT_LOG_v1 {"time_micros": 1760091424508482, "job": 32, "event": "compaction_finished", "compaction_time_micros": 91901, "compaction_time_cpu_micros": 52781, "output_level": 6, "num_output_files": 1, "total_output_size": 16925177, "num_input_records": 6562, "num_output_records": 6030, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424509159, "job": 32, "event": "table_file_deletion", "file_number": 56}
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424511566, "job": 32, "event": "table_file_deletion", "file_number": 54}
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.413897) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.511653) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.511658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.511659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.511661) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:17:04 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.511663) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:17:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:04.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:05 compute-2 nova_compute[235775]: 2025-10-10 10:17:05.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:05 compute-2 ceph-mon[74913]: pgmap v934: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 3.3 KiB/s wr, 1 op/s
Oct 10 10:17:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:06.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:06.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:07 compute-2 nova_compute[235775]: 2025-10-10 10:17:07.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:08 compute-2 ceph-mon[74913]: pgmap v935: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 3.3 KiB/s wr, 1 op/s
Oct 10 10:17:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:08.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:08.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:10 compute-2 ceph-mon[74913]: pgmap v936: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 108 KiB/s rd, 5.3 KiB/s wr, 179 op/s
Oct 10 10:17:10 compute-2 nova_compute[235775]: 2025-10-10 10:17:10.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:10.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:10.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:11 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:11 compute-2 podman[244571]: 2025-10-10 10:17:11.798788044 +0000 UTC m=+0.066211480 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 10 10:17:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:12 compute-2 ceph-mon[74913]: pgmap v937: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 107 KiB/s rd, 2.0 KiB/s wr, 178 op/s
Oct 10 10:17:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:12.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:12 compute-2 nova_compute[235775]: 2025-10-10 10:17:12.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:13.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:13 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:17:13.207 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:17:13 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:17:13.208 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:17:13 compute-2 nova_compute[235775]: 2025-10-10 10:17:13.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:14 compute-2 ceph-mon[74913]: pgmap v938: 353 pgs: 353 active+clean; 121 MiB data, 313 MiB used, 60 GiB / 60 GiB avail; 127 KiB/s rd, 4.2 KiB/s wr, 207 op/s
Oct 10 10:17:14 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2478587300' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:17:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:14.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:15.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:15 compute-2 nova_compute[235775]: 2025-10-10 10:17:15.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:16 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:16 compute-2 ceph-mon[74913]: pgmap v939: 353 pgs: 353 active+clean; 121 MiB data, 313 MiB used, 60 GiB / 60 GiB avail; 126 KiB/s rd, 4.2 KiB/s wr, 206 op/s
Oct 10 10:17:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:16.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:17.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:17:17 compute-2 nova_compute[235775]: 2025-10-10 10:17:17.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:18 compute-2 ceph-mon[74913]: pgmap v940: 353 pgs: 353 active+clean; 121 MiB data, 313 MiB used, 60 GiB / 60 GiB avail; 126 KiB/s rd, 4.2 KiB/s wr, 206 op/s
Oct 10 10:17:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:18.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:17:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:19.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:17:19 compute-2 sudo[244598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:17:19 compute-2 sudo[244598]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:17:19 compute-2 sudo[244598]: pam_unix(sudo:session): session closed for user root
Oct 10 10:17:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:20 compute-2 nova_compute[235775]: 2025-10-10 10:17:20.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:20 compute-2 ceph-mon[74913]: pgmap v941: 353 pgs: 353 active+clean; 121 MiB data, 313 MiB used, 60 GiB / 60 GiB avail; 127 KiB/s rd, 5.2 KiB/s wr, 207 op/s
Oct 10 10:17:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:17:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:20.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:17:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:21 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:21.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:21 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3931265671' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:17:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:22 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:17:22.210 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:17:22 compute-2 ceph-mon[74913]: pgmap v942: 353 pgs: 353 active+clean; 121 MiB data, 313 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 3.2 KiB/s wr, 29 op/s
Oct 10 10:17:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:22.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:22 compute-2 nova_compute[235775]: 2025-10-10 10:17:22.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:23.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:24 compute-2 ceph-mon[74913]: pgmap v943: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 6.0 KiB/s wr, 57 op/s
Oct 10 10:17:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:24.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:25.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:25 compute-2 nova_compute[235775]: 2025-10-10 10:17:25.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:26 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:26.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:26 compute-2 ceph-mon[74913]: pgmap v944: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 3.8 KiB/s wr, 29 op/s
Oct 10 10:17:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:17:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3051023765' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:17:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:17:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3051023765' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:17:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:27.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3051023765' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:17:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3051023765' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:17:27 compute-2 nova_compute[235775]: 2025-10-10 10:17:27.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:28.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:28 compute-2 ceph-mon[74913]: pgmap v945: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 3.8 KiB/s wr, 29 op/s
Oct 10 10:17:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:29.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:30 compute-2 nova_compute[235775]: 2025-10-10 10:17:30.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:30.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:30 compute-2 ceph-mon[74913]: pgmap v946: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 3.8 KiB/s wr, 29 op/s
Oct 10 10:17:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:31.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:17:31 compute-2 podman[244637]: 2025-10-10 10:17:31.814513898 +0000 UTC m=+0.068107612 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 10:17:31 compute-2 podman[244635]: 2025-10-10 10:17:31.814493857 +0000 UTC m=+0.078170256 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 10 10:17:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:31 compute-2 podman[244636]: 2025-10-10 10:17:31.85374623 +0000 UTC m=+0.111379774 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 10:17:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:32.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:32 compute-2 ceph-mon[74913]: pgmap v947: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.8 KiB/s wr, 29 op/s
Oct 10 10:17:32 compute-2 nova_compute[235775]: 2025-10-10 10:17:32.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:33.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:33 compute-2 ceph-mon[74913]: pgmap v948: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.8 KiB/s wr, 29 op/s
Oct 10 10:17:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:17:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:34.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:17:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:35.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:35 compute-2 nova_compute[235775]: 2025-10-10 10:17:35.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:36 compute-2 ceph-mon[74913]: pgmap v949: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:17:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:36.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:37.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:37 compute-2 nova_compute[235775]: 2025-10-10 10:17:37.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:38 compute-2 ceph-mon[74913]: pgmap v950: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:17:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:38.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:39.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:39 compute-2 sudo[244705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:17:39 compute-2 sudo[244705]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:17:39 compute-2 sudo[244705]: pam_unix(sudo:session): session closed for user root
Oct 10 10:17:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:40 compute-2 ceph-mon[74913]: pgmap v951: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:17:40 compute-2 nova_compute[235775]: 2025-10-10 10:17:40.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:40.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:41 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:41.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:17:41.471 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:17:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:17:41.472 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:17:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:17:41.472 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:17:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:42 compute-2 ceph-mon[74913]: pgmap v952: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:17:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:42.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:42 compute-2 podman[244733]: 2025-10-10 10:17:42.768615829 +0000 UTC m=+0.050320351 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:17:42 compute-2 nova_compute[235775]: 2025-10-10 10:17:42.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:43.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:44 compute-2 ceph-mon[74913]: pgmap v953: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:17:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:44.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:45.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:45 compute-2 nova_compute[235775]: 2025-10-10 10:17:45.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:46 compute-2 ceph-mon[74913]: pgmap v954: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:17:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/710863530' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:17:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:46.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:47.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:17:47 compute-2 nova_compute[235775]: 2025-10-10 10:17:47.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:48 compute-2 ceph-mon[74913]: pgmap v955: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:17:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:48.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:49.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3477895027' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:17:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1219895572' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:17:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:50 compute-2 ceph-mon[74913]: pgmap v956: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:17:50 compute-2 nova_compute[235775]: 2025-10-10 10:17:50.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:50.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:50 compute-2 nova_compute[235775]: 2025-10-10 10:17:50.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:17:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:51.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:52 compute-2 ceph-mon[74913]: pgmap v957: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:17:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:52.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:52 compute-2 nova_compute[235775]: 2025-10-10 10:17:52.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:52 compute-2 nova_compute[235775]: 2025-10-10 10:17:52.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:17:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:53.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:53 compute-2 nova_compute[235775]: 2025-10-10 10:17:53.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:17:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:54 compute-2 ceph-mon[74913]: pgmap v958: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 97 op/s
Oct 10 10:17:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:54.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:54 compute-2 nova_compute[235775]: 2025-10-10 10:17:54.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:17:54 compute-2 nova_compute[235775]: 2025-10-10 10:17:54.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:17:54 compute-2 nova_compute[235775]: 2025-10-10 10:17:54.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:17:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:54 compute-2 nova_compute[235775]: 2025-10-10 10:17:54.853 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:17:54 compute-2 nova_compute[235775]: 2025-10-10 10:17:54.855 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:17:54 compute-2 nova_compute[235775]: 2025-10-10 10:17:54.896 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:17:54 compute-2 nova_compute[235775]: 2025-10-10 10:17:54.897 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:17:54 compute-2 nova_compute[235775]: 2025-10-10 10:17:54.897 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:17:54 compute-2 nova_compute[235775]: 2025-10-10 10:17:54.897 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:17:54 compute-2 nova_compute[235775]: 2025-10-10 10:17:54.897 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:17:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:55.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:55 compute-2 nova_compute[235775]: 2025-10-10 10:17:55.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:17:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:17:55 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1465875043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:17:55 compute-2 nova_compute[235775]: 2025-10-10 10:17:55.425 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:17:55 compute-2 nova_compute[235775]: 2025-10-10 10:17:55.560 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:17:55 compute-2 nova_compute[235775]: 2025-10-10 10:17:55.561 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4948MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:17:55 compute-2 nova_compute[235775]: 2025-10-10 10:17:55.561 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:17:55 compute-2 nova_compute[235775]: 2025-10-10 10:17:55.561 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:17:55 compute-2 nova_compute[235775]: 2025-10-10 10:17:55.624 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:17:55 compute-2 nova_compute[235775]: 2025-10-10 10:17:55.625 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:17:55 compute-2 nova_compute[235775]: 2025-10-10 10:17:55.644 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:17:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:17:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:17:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:17:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:17:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:17:56 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3077656643' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:17:56 compute-2 nova_compute[235775]: 2025-10-10 10:17:56.088 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:17:56 compute-2 nova_compute[235775]: 2025-10-10 10:17:56.096 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:17:56 compute-2 nova_compute[235775]: 2025-10-10 10:17:56.116 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:17:56 compute-2 nova_compute[235775]: 2025-10-10 10:17:56.118 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:17:56 compute-2 nova_compute[235775]: 2025-10-10 10:17:56.119 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:17:56 compute-2 ceph-mon[74913]: pgmap v959: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 97 op/s
Oct 10 10:17:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1465875043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:17:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3077656643' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:17:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:56.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:17:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:57.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:17:57 compute-2 nova_compute[235775]: 2025-10-10 10:17:57.078 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:17:57 compute-2 sudo[244811]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:17:57 compute-2 sudo[244811]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:17:57 compute-2 sudo[244811]: pam_unix(sudo:session): session closed for user root
Oct 10 10:17:57 compute-2 sudo[244836]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:17:57 compute-2 sudo[244836]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:17:57 compute-2 sudo[244836]: pam_unix(sudo:session): session closed for user root
Oct 10 10:17:57 compute-2 nova_compute[235775]: 2025-10-10 10:17:57.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:17:57 compute-2 nova_compute[235775]: 2025-10-10 10:17:57.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:17:57 compute-2 nova_compute[235775]: 2025-10-10 10:17:57.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:17:57 compute-2 nova_compute[235775]: 2025-10-10 10:17:57.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:17:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:58 compute-2 ceph-mon[74913]: pgmap v960: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 97 op/s
Oct 10 10:17:58 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:17:58 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:17:58 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:17:58 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:17:58 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:17:58 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:17:58 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:17:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:58.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:17:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:17:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:59.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:17:59 compute-2 sudo[244894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:17:59 compute-2 sudo[244894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:17:59 compute-2 sudo[244894]: pam_unix(sudo:session): session closed for user root
Oct 10 10:17:59 compute-2 nova_compute[235775]: 2025-10-10 10:17:59.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:17:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:17:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:00 compute-2 nova_compute[235775]: 2025-10-10 10:18:00.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:00 compute-2 ceph-mon[74913]: pgmap v961: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 128 op/s
Oct 10 10:18:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/872784269' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3277334980' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.003000098s ======
Oct 10 10:18:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:00.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000098s
Oct 10 10:18:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:00 compute-2 nova_compute[235775]: 2025-10-10 10:18:00.811 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:18:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:18:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:01.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:18:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3477850028' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:02.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:02 compute-2 ceph-mon[74913]: pgmap v962: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 101 op/s
Oct 10 10:18:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:18:02 compute-2 podman[244922]: 2025-10-10 10:18:02.79860728 +0000 UTC m=+0.064489365 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3)
Oct 10 10:18:02 compute-2 nova_compute[235775]: 2025-10-10 10:18:02.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:02 compute-2 podman[244924]: 2025-10-10 10:18:02.816115704 +0000 UTC m=+0.066094748 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid)
Oct 10 10:18:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:02 compute-2 podman[244923]: 2025-10-10 10:18:02.838154293 +0000 UTC m=+0.096849416 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Oct 10 10:18:02 compute-2 sudo[244990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:18:02 compute-2 sudo[244990]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:18:02 compute-2 sudo[244990]: pam_unix(sudo:session): session closed for user root
Oct 10 10:18:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:03.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3550810255' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:18:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:18:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:18:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:04.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:18:04 compute-2 ceph-mon[74913]: pgmap v963: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 101 op/s
Oct 10 10:18:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3370266523' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:05.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:05 compute-2 nova_compute[235775]: 2025-10-10 10:18:05.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:05 compute-2 ceph-mon[74913]: pgmap v964: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 156 KiB/s rd, 1.2 KiB/s wr, 31 op/s
Oct 10 10:18:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:06.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:07.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:07 compute-2 nova_compute[235775]: 2025-10-10 10:18:07.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.115 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.116 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:08 compute-2 ceph-mon[74913]: pgmap v965: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 156 KiB/s rd, 1.2 KiB/s wr, 31 op/s
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.140 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.304 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.305 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.312 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.312 2 INFO nova.compute.claims [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Claim successful on node compute-2.ctlplane.example.com
Oct 10 10:18:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:08.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.436 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:18:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:08 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:18:08 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2863660861' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.875 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.880 2 DEBUG nova.compute.provider_tree [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.899 2 DEBUG nova.scheduler.client.report [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.925 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.925 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.984 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 10 10:18:08 compute-2 nova_compute[235775]: 2025-10-10 10:18:08.984 2 DEBUG nova.network.neutron [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.022 2 INFO nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.043 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 10 10:18:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:09.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:09 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2863660861' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.187 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.189 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.189 2 INFO nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Creating image(s)
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.220 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.250 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.277 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.280 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.351 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.352 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.353 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.353 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.375 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.378 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.596 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.218s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.670 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] resizing rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.824 2 DEBUG nova.objects.instance [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 49a68fc9-f469-4827-9bb8-f2c2981d2b68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 10:18:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.845 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.845 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Ensure instance console log exists: /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.846 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.846 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:09 compute-2 nova_compute[235775]: 2025-10-10 10:18:09.847 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:10 compute-2 nova_compute[235775]: 2025-10-10 10:18:10.054 2 DEBUG nova.policy [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7956778c03764aaf8906c9b435337976', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 10 10:18:10 compute-2 ceph-mon[74913]: pgmap v966: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 156 KiB/s rd, 1.2 KiB/s wr, 32 op/s
Oct 10 10:18:10 compute-2 nova_compute[235775]: 2025-10-10 10:18:10.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:18:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:10.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:18:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:11 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:11.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.166559) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491167119, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1184, "num_deletes": 501, "total_data_size": 1952418, "memory_usage": 1978488, "flush_reason": "Manual Compaction"}
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491174715, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 897365, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30154, "largest_seqno": 31333, "table_properties": {"data_size": 893126, "index_size": 1379, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13960, "raw_average_key_size": 19, "raw_value_size": 882191, "raw_average_value_size": 1232, "num_data_blocks": 61, "num_entries": 716, "num_filter_entries": 716, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091425, "oldest_key_time": 1760091425, "file_creation_time": 1760091491, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 8203 microseconds, and 3455 cpu microseconds.
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.174768) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 897365 bytes OK
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.174791) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.177100) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.177124) EVENT_LOG_v1 {"time_micros": 1760091491177117, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.177145) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 1945717, prev total WAL file size 1945717, number of live WAL files 2.
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.178146) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(876KB)], [57(16MB)]
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491178179, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 17822542, "oldest_snapshot_seqno": -1}
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5753 keys, 12048656 bytes, temperature: kUnknown
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491245991, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12048656, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12012624, "index_size": 20562, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14405, "raw_key_size": 148782, "raw_average_key_size": 25, "raw_value_size": 11911109, "raw_average_value_size": 2070, "num_data_blocks": 824, "num_entries": 5753, "num_filter_entries": 5753, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091491, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.246220) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12048656 bytes
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.247957) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 262.6 rd, 177.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 16.1 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(33.3) write-amplify(13.4) OK, records in: 6746, records dropped: 993 output_compression: NoCompression
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.247974) EVENT_LOG_v1 {"time_micros": 1760091491247966, "job": 34, "event": "compaction_finished", "compaction_time_micros": 67874, "compaction_time_cpu_micros": 35310, "output_level": 6, "num_output_files": 1, "total_output_size": 12048656, "num_input_records": 6746, "num_output_records": 5753, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491248239, "job": 34, "event": "table_file_deletion", "file_number": 59}
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491251515, "job": 34, "event": "table_file_deletion", "file_number": 57}
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.178067) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.251564) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.251569) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.251571) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.251573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:18:11 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.251575) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:18:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:11 compute-2 nova_compute[235775]: 2025-10-10 10:18:11.860 2 DEBUG nova.network.neutron [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Successfully updated port: 864e1646-5abd-4268-a80a-c224425c842d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 10 10:18:11 compute-2 nova_compute[235775]: 2025-10-10 10:18:11.874 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:18:11 compute-2 nova_compute[235775]: 2025-10-10 10:18:11.874 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquired lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:18:11 compute-2 nova_compute[235775]: 2025-10-10 10:18:11.874 2 DEBUG nova.network.neutron [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 10 10:18:11 compute-2 nova_compute[235775]: 2025-10-10 10:18:11.949 2 DEBUG nova.compute.manager [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-changed-864e1646-5abd-4268-a80a-c224425c842d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:18:11 compute-2 nova_compute[235775]: 2025-10-10 10:18:11.950 2 DEBUG nova.compute.manager [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Refreshing instance network info cache due to event network-changed-864e1646-5abd-4268-a80a-c224425c842d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 10 10:18:11 compute-2 nova_compute[235775]: 2025-10-10 10:18:11.950 2 DEBUG oslo_concurrency.lockutils [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:18:12 compute-2 nova_compute[235775]: 2025-10-10 10:18:12.057 2 DEBUG nova.network.neutron [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 10 10:18:12 compute-2 ceph-mon[74913]: pgmap v967: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:18:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:12.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:12 compute-2 nova_compute[235775]: 2025-10-10 10:18:12.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:13.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.533 2 DEBUG nova.network.neutron [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Updating instance_info_cache with network_info: [{"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.576 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Releasing lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.576 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Instance network_info: |[{"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.577 2 DEBUG oslo_concurrency.lockutils [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.577 2 DEBUG nova.network.neutron [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Refreshing network info cache for port 864e1646-5abd-4268-a80a-c224425c842d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.579 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Start _get_guest_xml network_info=[{"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'encryption_options': None, 'image_id': '5ae78700-970d-45b4-a57d-978a054c7519'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.583 2 WARNING nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.587 2 DEBUG nova.virt.libvirt.host [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.588 2 DEBUG nova.virt.libvirt.host [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.591 2 DEBUG nova.virt.libvirt.host [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.592 2 DEBUG nova.virt.libvirt.host [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.592 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.592 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-10T10:09:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='00373e71-6208-4238-ad85-db0452c53bc6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.593 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.593 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.593 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.593 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.593 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.594 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.594 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.594 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.594 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.594 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 10 10:18:13 compute-2 nova_compute[235775]: 2025-10-10 10:18:13.597 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:18:13 compute-2 podman[245214]: 2025-10-10 10:18:13.79389691 +0000 UTC m=+0.065499978 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct 10 10:18:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 10:18:14 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2249811561' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.071 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.094 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.098 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:18:14 compute-2 ceph-mon[74913]: pgmap v968: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:18:14 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2249811561' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:18:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:14.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 10:18:14 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1035205217' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.612 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.613 2 DEBUG nova.virt.libvirt.vif [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-681617856',display_name='tempest-TestNetworkBasicOps-server-681617856',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-681617856',id=9,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIcYTEXDYtAk18KooLsNGiBbJHsQVG+1VrBdrz3ofp65nb477sGHgmoQEtvfZnvU1CDeiIFLoTRDtJRom4RiTMzgyKw8lTmf0SFcI9wASAJTcgKdt8HRVl+kZ8Ero4zmQ==',key_name='tempest-TestNetworkBasicOps-1805593060',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-0k9ji85m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:18:09Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=49a68fc9-f469-4827-9bb8-f2c2981d2b68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.614 2 DEBUG nova.network.os_vif_util [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.615 2 DEBUG nova.network.os_vif_util [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.615 2 DEBUG nova.objects.instance [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 49a68fc9-f469-4827-9bb8-f2c2981d2b68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.637 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] End _get_guest_xml xml=<domain type="kvm">
Oct 10 10:18:14 compute-2 nova_compute[235775]:   <uuid>49a68fc9-f469-4827-9bb8-f2c2981d2b68</uuid>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   <name>instance-00000009</name>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   <memory>131072</memory>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   <vcpu>1</vcpu>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   <metadata>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <nova:name>tempest-TestNetworkBasicOps-server-681617856</nova:name>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <nova:creationTime>2025-10-10 10:18:13</nova:creationTime>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <nova:flavor name="m1.nano">
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <nova:memory>128</nova:memory>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <nova:disk>1</nova:disk>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <nova:swap>0</nova:swap>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <nova:ephemeral>0</nova:ephemeral>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <nova:vcpus>1</nova:vcpus>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       </nova:flavor>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <nova:owner>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <nova:user uuid="7956778c03764aaf8906c9b435337976">tempest-TestNetworkBasicOps-188749107-project-member</nova:user>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <nova:project uuid="d5e531d4b440422d946eaf6fd4e166f7">tempest-TestNetworkBasicOps-188749107</nova:project>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       </nova:owner>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <nova:root type="image" uuid="5ae78700-970d-45b4-a57d-978a054c7519"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <nova:ports>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <nova:port uuid="864e1646-5abd-4268-a80a-c224425c842d">
Oct 10 10:18:14 compute-2 nova_compute[235775]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         </nova:port>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       </nova:ports>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     </nova:instance>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   </metadata>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   <sysinfo type="smbios">
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <system>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <entry name="manufacturer">RDO</entry>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <entry name="product">OpenStack Compute</entry>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <entry name="serial">49a68fc9-f469-4827-9bb8-f2c2981d2b68</entry>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <entry name="uuid">49a68fc9-f469-4827-9bb8-f2c2981d2b68</entry>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <entry name="family">Virtual Machine</entry>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     </system>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   </sysinfo>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   <os>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <boot dev="hd"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <smbios mode="sysinfo"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   </os>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   <features>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <acpi/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <apic/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <vmcoreinfo/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   </features>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   <clock offset="utc">
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <timer name="pit" tickpolicy="delay"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <timer name="hpet" present="no"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   </clock>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   <cpu mode="host-model" match="exact">
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <topology sockets="1" cores="1" threads="1"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   </cpu>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   <devices>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <disk type="network" device="disk">
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <driver type="raw" cache="none"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <source protocol="rbd" name="vms/49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk">
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <host name="192.168.122.100" port="6789"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <host name="192.168.122.102" port="6789"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <host name="192.168.122.101" port="6789"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       </source>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <auth username="openstack">
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       </auth>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <target dev="vda" bus="virtio"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     </disk>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <disk type="network" device="cdrom">
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <driver type="raw" cache="none"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <source protocol="rbd" name="vms/49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config">
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <host name="192.168.122.100" port="6789"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <host name="192.168.122.102" port="6789"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <host name="192.168.122.101" port="6789"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       </source>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <auth username="openstack">
Oct 10 10:18:14 compute-2 nova_compute[235775]:         <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       </auth>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <target dev="sda" bus="sata"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     </disk>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <interface type="ethernet">
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <mac address="fa:16:3e:19:de:db"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <model type="virtio"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <driver name="vhost" rx_queue_size="512"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <mtu size="1442"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <target dev="tap864e1646-5a"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     </interface>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <serial type="pty">
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <log file="/var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/console.log" append="off"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     </serial>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <video>
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <model type="virtio"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     </video>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <input type="tablet" bus="usb"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <rng model="virtio">
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <backend model="random">/dev/urandom</backend>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     </rng>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <controller type="usb" index="0"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     <memballoon model="virtio">
Oct 10 10:18:14 compute-2 nova_compute[235775]:       <stats period="10"/>
Oct 10 10:18:14 compute-2 nova_compute[235775]:     </memballoon>
Oct 10 10:18:14 compute-2 nova_compute[235775]:   </devices>
Oct 10 10:18:14 compute-2 nova_compute[235775]: </domain>
Oct 10 10:18:14 compute-2 nova_compute[235775]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.638 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Preparing to wait for external event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.638 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.638 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.639 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.639 2 DEBUG nova.virt.libvirt.vif [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-681617856',display_name='tempest-TestNetworkBasicOps-server-681617856',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-681617856',id=9,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIcYTEXDYtAk18KooLsNGiBbJHsQVG+1VrBdrz3ofp65nb477sGHgmoQEtvfZnvU1CDeiIFLoTRDtJRom4RiTMzgyKw8lTmf0SFcI9wASAJTcgKdt8HRVl+kZ8Ero4zmQ==',key_name='tempest-TestNetworkBasicOps-1805593060',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-0k9ji85m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:18:09Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=49a68fc9-f469-4827-9bb8-f2c2981d2b68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.639 2 DEBUG nova.network.os_vif_util [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.640 2 DEBUG nova.network.os_vif_util [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.640 2 DEBUG os_vif [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.641 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.641 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.644 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap864e1646-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.644 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap864e1646-5a, col_values=(('external_ids', {'iface-id': '864e1646-5abd-4268-a80a-c224425c842d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:de:db', 'vm-uuid': '49a68fc9-f469-4827-9bb8-f2c2981d2b68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:14 compute-2 NetworkManager[44866]: <info>  [1760091494.6481] manager: (tap864e1646-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.662 2 INFO os_vif [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a')
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.724 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.725 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.725 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No VIF found with MAC fa:16:3e:19:de:db, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.725 2 INFO nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Using config drive
Oct 10 10:18:14 compute-2 nova_compute[235775]: 2025-10-10 10:18:14.751 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:18:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:15.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:15 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1035205217' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.241 2 DEBUG nova.network.neutron [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Updated VIF entry in instance network info cache for port 864e1646-5abd-4268-a80a-c224425c842d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.242 2 DEBUG nova.network.neutron [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Updating instance_info_cache with network_info: [{"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.261 2 DEBUG oslo_concurrency.lockutils [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.304 2 INFO nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Creating config drive at /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.315 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdo0s4rze execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:18:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.446 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdo0s4rze" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.484 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.490 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.657 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.659 2 INFO nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Deleting local config drive /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config because it was imported into RBD.
Oct 10 10:18:15 compute-2 systemd[1]: Starting libvirt secret daemon...
Oct 10 10:18:15 compute-2 systemd[1]: Started libvirt secret daemon.
Oct 10 10:18:15 compute-2 kernel: tap864e1646-5a: entered promiscuous mode
Oct 10 10:18:15 compute-2 NetworkManager[44866]: <info>  [1760091495.7659] manager: (tap864e1646-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Oct 10 10:18:15 compute-2 ovn_controller[132503]: 2025-10-10T10:18:15Z|00038|binding|INFO|Claiming lport 864e1646-5abd-4268-a80a-c224425c842d for this chassis.
Oct 10 10:18:15 compute-2 ovn_controller[132503]: 2025-10-10T10:18:15Z|00039|binding|INFO|864e1646-5abd-4268-a80a-c224425c842d: Claiming fa:16:3e:19:de:db 10.100.0.4
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:15 compute-2 NetworkManager[44866]: <info>  [1760091495.7871] manager: (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Oct 10 10:18:15 compute-2 NetworkManager[44866]: <info>  [1760091495.7877] manager: (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.793 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:de:db 10.100.0.4'], port_security=['fa:16:3e:19:de:db 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1060241160', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '49a68fc9-f469-4827-9bb8-f2c2981d2b68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1060241160', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '7', 'neutron:security_group_ids': '79abf760-0fb0-448c-b5c8-75027ac31ae3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58a83406-32bd-40d9-b3dd-ed56e38abb09, chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=864e1646-5abd-4268-a80a-c224425c842d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.794 141795 INFO neutron.agent.ovn.metadata.agent [-] Port 864e1646-5abd-4268-a80a-c224425c842d in datapath f2187c16-3ad9-4fc6-892a-d36a6262d4d0 bound to our chassis
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.795 141795 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2187c16-3ad9-4fc6-892a-d36a6262d4d0
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.809 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[640d9617-f1f7-4663-b12f-e28577560800]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.810 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf2187c16-31 in ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 10 10:18:15 compute-2 systemd-udevd[245387]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 10:18:15 compute-2 systemd-machined[192768]: New machine qemu-2-instance-00000009.
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.811 241439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf2187c16-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.812 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[12875ac3-de6a-40ed-9ec2-fb7dd8e499d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.812 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[2995be21-5576-4861-b1ba-f3c5c11f802a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.824 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[a86481f6-cf93-446a-9c89-e73b17fba6f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 NetworkManager[44866]: <info>  [1760091495.8256] device (tap864e1646-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 10:18:15 compute-2 NetworkManager[44866]: <info>  [1760091495.8264] device (tap864e1646-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 10 10:18:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:15 compute-2 systemd[1]: Started Virtual Machine qemu-2-instance-00000009.
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.851 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[4e50da17-715c-4d84-9faa-5cc7489ca375]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.878 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[7e732e89-34d4-4ca7-b0bb-37d8c50dd535]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:15 compute-2 NetworkManager[44866]: <info>  [1760091495.8843] manager: (tapf2187c16-30): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.883 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[cca50fbb-8be7-42ed-ae60-ef59b4965de7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:15 compute-2 ovn_controller[132503]: 2025-10-10T10:18:15Z|00040|binding|INFO|Setting lport 864e1646-5abd-4268-a80a-c224425c842d ovn-installed in OVS
Oct 10 10:18:15 compute-2 ovn_controller[132503]: 2025-10-10T10:18:15Z|00041|binding|INFO|Setting lport 864e1646-5abd-4268-a80a-c224425c842d up in Southbound
Oct 10 10:18:15 compute-2 nova_compute[235775]: 2025-10-10 10:18:15.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.916 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[c97adbd6-2c95-4346-9b48-267a7c5c743e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.918 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[8397f14d-8146-46f8-b077-24f4e861e559]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 NetworkManager[44866]: <info>  [1760091495.9375] device (tapf2187c16-30): carrier: link connected
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.941 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b1fc67-1f65-430b-a869-b003421dc8a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.956 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[75a0f06b-88ea-4534-bfbc-d8ede8097ee9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2187c16-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:33:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439949, 'reachable_time': 25639, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245419, 'error': None, 'target': 'ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.967 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[7fdaa929-b5d8-4063-9701-ddcbd88d3d00]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3311'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439949, 'tstamp': 439949}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245420, 'error': None, 'target': 'ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:15 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.984 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[38ade2a9-f218-4c73-802d-beb381f85e0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2187c16-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:33:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439949, 'reachable_time': 25639, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245421, 'error': None, 'target': 'ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:16 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.026 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[aa61a3b6-eca6-48c4-a8f2-0ff80d6ab3e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.101 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[a09de635-44d2-4dbf-9cfa-9f8aad578773]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.102 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2187c16-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.103 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.103 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2187c16-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:18:16 compute-2 kernel: tapf2187c16-30: entered promiscuous mode
Oct 10 10:18:16 compute-2 nova_compute[235775]: 2025-10-10 10:18:16.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:16 compute-2 NetworkManager[44866]: <info>  [1760091496.1085] manager: (tapf2187c16-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.109 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2187c16-30, col_values=(('external_ids', {'iface-id': 'e9f075b6-37df-4f28-90c0-0fcdd3460568'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:18:16 compute-2 nova_compute[235775]: 2025-10-10 10:18:16.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:16 compute-2 ovn_controller[132503]: 2025-10-10T10:18:16Z|00042|binding|INFO|Releasing lport e9f075b6-37df-4f28-90c0-0fcdd3460568 from this chassis (sb_readonly=0)
Oct 10 10:18:16 compute-2 nova_compute[235775]: 2025-10-10 10:18:16.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.134 141795 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2187c16-3ad9-4fc6-892a-d36a6262d4d0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2187c16-3ad9-4fc6-892a-d36a6262d4d0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.135 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[fe067dea-f1c8-4370-823a-e9e6c9c6ee68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.135 141795 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: global
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     log         /dev/log local0 debug
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     log-tag     haproxy-metadata-proxy-f2187c16-3ad9-4fc6-892a-d36a6262d4d0
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     user        root
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     group       root
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     maxconn     1024
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     pidfile     /var/lib/neutron/external/pids/f2187c16-3ad9-4fc6-892a-d36a6262d4d0.pid.haproxy
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     daemon
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: defaults
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     log global
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     mode http
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     option httplog
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     option dontlognull
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     option http-server-close
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     option forwardfor
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     retries                 3
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     timeout http-request    30s
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     timeout connect         30s
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     timeout client          32s
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     timeout server          32s
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     timeout http-keep-alive 30s
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: listen listener
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     bind 169.254.169.254:80
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     server metadata /var/lib/neutron/metadata_proxy
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:     http-request add-header X-OVN-Network-ID f2187c16-3ad9-4fc6-892a-d36a6262d4d0
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 10 10:18:16 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.136 141795 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'env', 'PROCESS_TAG=haproxy-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f2187c16-3ad9-4fc6-892a-d36a6262d4d0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 10 10:18:16 compute-2 ceph-mon[74913]: pgmap v969: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:18:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:16.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:16 compute-2 podman[245495]: 2025-10-10 10:18:16.489941313 +0000 UTC m=+0.051034275 container create 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 10:18:16 compute-2 systemd[1]: Started libpod-conmon-0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca.scope.
Oct 10 10:18:16 compute-2 podman[245495]: 2025-10-10 10:18:16.46362223 +0000 UTC m=+0.024715212 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 10:18:16 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:18:16 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02920de339ab3d96609b58c8fc65fe45954dd283163f9dcf7301f5c71f47af34/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 10 10:18:16 compute-2 podman[245495]: 2025-10-10 10:18:16.58228024 +0000 UTC m=+0.143373212 container init 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 10 10:18:16 compute-2 podman[245495]: 2025-10-10 10:18:16.589863762 +0000 UTC m=+0.150956714 container start 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 10 10:18:16 compute-2 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [NOTICE]   (245514) : New worker (245517) forked
Oct 10 10:18:16 compute-2 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [NOTICE]   (245514) : Loading success.
Oct 10 10:18:16 compute-2 nova_compute[235775]: 2025-10-10 10:18:16.627 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091496.6262186, 49a68fc9-f469-4827-9bb8-f2c2981d2b68 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:18:16 compute-2 nova_compute[235775]: 2025-10-10 10:18:16.627 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] VM Started (Lifecycle Event)
Oct 10 10:18:16 compute-2 nova_compute[235775]: 2025-10-10 10:18:16.656 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:18:16 compute-2 nova_compute[235775]: 2025-10-10 10:18:16.662 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091496.627454, 49a68fc9-f469-4827-9bb8-f2c2981d2b68 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:18:16 compute-2 nova_compute[235775]: 2025-10-10 10:18:16.663 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] VM Paused (Lifecycle Event)
Oct 10 10:18:16 compute-2 nova_compute[235775]: 2025-10-10 10:18:16.684 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:18:16 compute-2 nova_compute[235775]: 2025-10-10 10:18:16.689 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 10 10:18:16 compute-2 nova_compute[235775]: 2025-10-10 10:18:16.716 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 10 10:18:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:17.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.911 2 DEBUG nova.compute.manager [req-798379c4-24d0-4d4f-8bc9-120affb2373e req-8a24784a-2e80-40d9-a76b-af77d223ff2b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.911 2 DEBUG oslo_concurrency.lockutils [req-798379c4-24d0-4d4f-8bc9-120affb2373e req-8a24784a-2e80-40d9-a76b-af77d223ff2b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.912 2 DEBUG oslo_concurrency.lockutils [req-798379c4-24d0-4d4f-8bc9-120affb2373e req-8a24784a-2e80-40d9-a76b-af77d223ff2b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.912 2 DEBUG oslo_concurrency.lockutils [req-798379c4-24d0-4d4f-8bc9-120affb2373e req-8a24784a-2e80-40d9-a76b-af77d223ff2b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.912 2 DEBUG nova.compute.manager [req-798379c4-24d0-4d4f-8bc9-120affb2373e req-8a24784a-2e80-40d9-a76b-af77d223ff2b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Processing event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.913 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.916 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091497.9161136, 49a68fc9-f469-4827-9bb8-f2c2981d2b68 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.916 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] VM Resumed (Lifecycle Event)
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.918 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.922 2 INFO nova.virt.libvirt.driver [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Instance spawned successfully.
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.922 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.942 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.951 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.958 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.959 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.960 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.960 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.961 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.962 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:18:17 compute-2 nova_compute[235775]: 2025-10-10 10:18:17.974 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 10 10:18:18 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:18.023 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:18:18 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:18.024 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:18:18 compute-2 nova_compute[235775]: 2025-10-10 10:18:18.025 2 INFO nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Took 8.84 seconds to spawn the instance on the hypervisor.
Oct 10 10:18:18 compute-2 nova_compute[235775]: 2025-10-10 10:18:18.026 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:18:18 compute-2 nova_compute[235775]: 2025-10-10 10:18:18.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:18 compute-2 nova_compute[235775]: 2025-10-10 10:18:18.096 2 INFO nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Took 9.82 seconds to build instance.
Oct 10 10:18:18 compute-2 nova_compute[235775]: 2025-10-10 10:18:18.121 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:18 compute-2 ceph-mon[74913]: pgmap v970: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:18:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:18.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:19.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:19 compute-2 nova_compute[235775]: 2025-10-10 10:18:19.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:19 compute-2 sudo[245529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:18:19 compute-2 sudo[245529]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:18:19 compute-2 sudo[245529]: pam_unix(sudo:session): session closed for user root
Oct 10 10:18:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:20 compute-2 nova_compute[235775]: 2025-10-10 10:18:20.000 2 DEBUG nova.compute.manager [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:18:20 compute-2 nova_compute[235775]: 2025-10-10 10:18:20.000 2 DEBUG oslo_concurrency.lockutils [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:20 compute-2 nova_compute[235775]: 2025-10-10 10:18:20.001 2 DEBUG oslo_concurrency.lockutils [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:20 compute-2 nova_compute[235775]: 2025-10-10 10:18:20.001 2 DEBUG oslo_concurrency.lockutils [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:20 compute-2 nova_compute[235775]: 2025-10-10 10:18:20.002 2 DEBUG nova.compute.manager [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] No waiting events found dispatching network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:18:20 compute-2 nova_compute[235775]: 2025-10-10 10:18:20.002 2 WARNING nova.compute.manager [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received unexpected event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d for instance with vm_state active and task_state None.
Oct 10 10:18:20 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:20.027 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:18:20 compute-2 ceph-mon[74913]: pgmap v971: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Oct 10 10:18:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:20.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:21 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:21.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:22 compute-2 ceph-mon[74913]: pgmap v972: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Oct 10 10:18:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:22.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:22 compute-2 nova_compute[235775]: 2025-10-10 10:18:22.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.073 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.074 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.074 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.075 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.075 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.077 2 INFO nova.compute.manager [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Terminating instance
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.079 2 DEBUG nova.compute.manager [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 10 10:18:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:18:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:23.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:18:23 compute-2 kernel: tap864e1646-5a (unregistering): left promiscuous mode
Oct 10 10:18:23 compute-2 NetworkManager[44866]: <info>  [1760091503.1188] device (tap864e1646-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 10 10:18:23 compute-2 ovn_controller[132503]: 2025-10-10T10:18:23Z|00043|binding|INFO|Releasing lport 864e1646-5abd-4268-a80a-c224425c842d from this chassis (sb_readonly=0)
Oct 10 10:18:23 compute-2 ovn_controller[132503]: 2025-10-10T10:18:23Z|00044|binding|INFO|Setting lport 864e1646-5abd-4268-a80a-c224425c842d down in Southbound
Oct 10 10:18:23 compute-2 ovn_controller[132503]: 2025-10-10T10:18:23Z|00045|binding|INFO|Removing iface tap864e1646-5a ovn-installed in OVS
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.133 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:de:db 10.100.0.4'], port_security=['fa:16:3e:19:de:db 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1060241160', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '49a68fc9-f469-4827-9bb8-f2c2981d2b68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1060241160', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '9', 'neutron:security_group_ids': '79abf760-0fb0-448c-b5c8-75027ac31ae3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.195', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58a83406-32bd-40d9-b3dd-ed56e38abb09, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=864e1646-5abd-4268-a80a-c224425c842d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.134 141795 INFO neutron.agent.ovn.metadata.agent [-] Port 864e1646-5abd-4268-a80a-c224425c842d in datapath f2187c16-3ad9-4fc6-892a-d36a6262d4d0 unbound from our chassis
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.135 141795 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2187c16-3ad9-4fc6-892a-d36a6262d4d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.136 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[4eca5dc8-6509-42ee-a65b-85705c974ee9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.136 141795 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0 namespace which is not needed anymore
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000009.scope: Deactivated successfully.
Oct 10 10:18:23 compute-2 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000009.scope: Consumed 6.032s CPU time.
Oct 10 10:18:23 compute-2 systemd-machined[192768]: Machine qemu-2-instance-00000009 terminated.
Oct 10 10:18:23 compute-2 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [NOTICE]   (245514) : haproxy version is 2.8.14-c23fe91
Oct 10 10:18:23 compute-2 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [NOTICE]   (245514) : path to executable is /usr/sbin/haproxy
Oct 10 10:18:23 compute-2 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [WARNING]  (245514) : Exiting Master process...
Oct 10 10:18:23 compute-2 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [WARNING]  (245514) : Exiting Master process...
Oct 10 10:18:23 compute-2 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [ALERT]    (245514) : Current worker (245517) exited with code 143 (Terminated)
Oct 10 10:18:23 compute-2 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [WARNING]  (245514) : All workers exited. Exiting... (0)
Oct 10 10:18:23 compute-2 systemd[1]: libpod-0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca.scope: Deactivated successfully.
Oct 10 10:18:23 compute-2 podman[245582]: 2025-10-10 10:18:23.256878559 +0000 UTC m=+0.038074430 container died 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:18:23 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca-userdata-shm.mount: Deactivated successfully.
Oct 10 10:18:23 compute-2 systemd[1]: var-lib-containers-storage-overlay-02920de339ab3d96609b58c8fc65fe45954dd283163f9dcf7301f5c71f47af34-merged.mount: Deactivated successfully.
Oct 10 10:18:23 compute-2 podman[245582]: 2025-10-10 10:18:23.289099761 +0000 UTC m=+0.070295612 container cleanup 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:18:23 compute-2 systemd[1]: libpod-conmon-0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca.scope: Deactivated successfully.
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.315 2 INFO nova.virt.libvirt.driver [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Instance destroyed successfully.
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.316 2 DEBUG nova.objects.instance [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'resources' on Instance uuid 49a68fc9-f469-4827-9bb8-f2c2981d2b68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.339 2 DEBUG nova.virt.libvirt.vif [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-10T10:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-681617856',display_name='tempest-TestNetworkBasicOps-server-681617856',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-681617856',id=9,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIcYTEXDYtAk18KooLsNGiBbJHsQVG+1VrBdrz3ofp65nb477sGHgmoQEtvfZnvU1CDeiIFLoTRDtJRom4RiTMzgyKw8lTmf0SFcI9wASAJTcgKdt8HRVl+kZ8Ero4zmQ==',key_name='tempest-TestNetworkBasicOps-1805593060',keypairs=<?>,launch_index=0,launched_at=2025-10-10T10:18:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-0k9ji85m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-10T10:18:18Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=49a68fc9-f469-4827-9bb8-f2c2981d2b68,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.339 2 DEBUG nova.network.os_vif_util [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.340 2 DEBUG nova.network.os_vif_util [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.340 2 DEBUG os_vif [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.342 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap864e1646-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.346 2 INFO os_vif [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a')
Oct 10 10:18:23 compute-2 podman[245613]: 2025-10-10 10:18:23.356868011 +0000 UTC m=+0.044469025 container remove 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.362 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[41584cba-21c7-4ac6-aedc-e82787b8001e]: (4, ('Fri Oct 10 10:18:23 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0 (0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca)\n0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca\nFri Oct 10 10:18:23 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0 (0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca)\n0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.364 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[34c7922b-abdf-4f2e-bcc1-ce881e3f60c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.365 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2187c16-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 kernel: tapf2187c16-30: left promiscuous mode
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.371 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[f30d7ee1-fc79-43a9-be11-d84b11c523a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.401 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[1565ba04-aec2-4bde-b5a0-e75e2260d018]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.402 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f51322-4f75-4f64-ae68-0934207d6ed5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.415 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[b44148eb-32db-4806-a959-a398203125f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439943, 'reachable_time': 38988, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245652, 'error': None, 'target': 'ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.417 141908 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 10 10:18:23 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.417 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[b2753098-17f6-4f0a-a8bf-5fabe076e43c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:18:23 compute-2 systemd[1]: run-netns-ovnmeta\x2df2187c16\x2d3ad9\x2d4fc6\x2d892a\x2dd36a6262d4d0.mount: Deactivated successfully.
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.439 2 DEBUG nova.compute.manager [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-vif-unplugged-864e1646-5abd-4268-a80a-c224425c842d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.440 2 DEBUG oslo_concurrency.lockutils [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.441 2 DEBUG oslo_concurrency.lockutils [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.441 2 DEBUG oslo_concurrency.lockutils [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.441 2 DEBUG nova.compute.manager [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] No waiting events found dispatching network-vif-unplugged-864e1646-5abd-4268-a80a-c224425c842d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.442 2 DEBUG nova.compute.manager [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-vif-unplugged-864e1646-5abd-4268-a80a-c224425c842d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.698 2 INFO nova.virt.libvirt.driver [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Deleting instance files /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68_del
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.700 2 INFO nova.virt.libvirt.driver [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Deletion of /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68_del complete
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.753 2 INFO nova.compute.manager [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Took 0.67 seconds to destroy the instance on the hypervisor.
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.754 2 DEBUG oslo.service.loopingcall [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.755 2 DEBUG nova.compute.manager [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 10 10:18:23 compute-2 nova_compute[235775]: 2025-10-10 10:18:23.755 2 DEBUG nova.network.neutron [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 10 10:18:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:24 compute-2 ceph-mon[74913]: pgmap v973: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Oct 10 10:18:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:24.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:25.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.152 2 DEBUG nova.network.neutron [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.170 2 INFO nova.compute.manager [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Took 1.41 seconds to deallocate network for instance.
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.221 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.222 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.303 2 DEBUG oslo_concurrency.processutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:18:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.542 2 DEBUG nova.compute.manager [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.543 2 DEBUG oslo_concurrency.lockutils [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.543 2 DEBUG oslo_concurrency.lockutils [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.544 2 DEBUG oslo_concurrency.lockutils [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.544 2 DEBUG nova.compute.manager [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] No waiting events found dispatching network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.544 2 WARNING nova.compute.manager [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received unexpected event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d for instance with vm_state deleted and task_state None.
Oct 10 10:18:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:18:25 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1600136722' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.787 2 DEBUG oslo_concurrency.processutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.795 2 DEBUG nova.compute.provider_tree [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.816 2 DEBUG nova.scheduler.client.report [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:18:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.844 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.893 2 INFO nova.scheduler.client.report [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Deleted allocations for instance 49a68fc9-f469-4827-9bb8-f2c2981d2b68
Oct 10 10:18:25 compute-2 nova_compute[235775]: 2025-10-10 10:18:25.970 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.896s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:26 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:26 compute-2 ceph-mon[74913]: pgmap v974: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Oct 10 10:18:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1600136722' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:18:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:26.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:18:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:18:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1965479803' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:18:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:18:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1965479803' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:18:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:27.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1965479803' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:18:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1965479803' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:18:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:27 compute-2 nova_compute[235775]: 2025-10-10 10:18:27.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:28 compute-2 ceph-mon[74913]: pgmap v975: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Oct 10 10:18:28 compute-2 nova_compute[235775]: 2025-10-10 10:18:28.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:28.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:18:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:29.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:18:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:30 compute-2 ceph-mon[74913]: pgmap v976: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 101 op/s
Oct 10 10:18:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:30.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:18:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:31.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:18:31 compute-2 ceph-mon[74913]: pgmap v977: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.2 KiB/s wr, 91 op/s
Oct 10 10:18:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:18:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:32.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:32 compute-2 nova_compute[235775]: 2025-10-10 10:18:32.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:32 compute-2 nova_compute[235775]: 2025-10-10 10:18:32.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:32 compute-2 nova_compute[235775]: 2025-10-10 10:18:32.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:33.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:33 compute-2 nova_compute[235775]: 2025-10-10 10:18:33.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:33 compute-2 podman[245690]: 2025-10-10 10:18:33.804687511 +0000 UTC m=+0.063243517 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:18:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:33 compute-2 podman[245688]: 2025-10-10 10:18:33.836410106 +0000 UTC m=+0.098734342 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 10 10:18:33 compute-2 podman[245689]: 2025-10-10 10:18:33.839055872 +0000 UTC m=+0.095834780 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 10 10:18:34 compute-2 ceph-mon[74913]: pgmap v978: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.2 KiB/s wr, 91 op/s
Oct 10 10:18:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:34.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:35.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:36 compute-2 ceph-mon[74913]: pgmap v979: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Oct 10 10:18:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:18:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:36.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:18:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:37.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:37 compute-2 nova_compute[235775]: 2025-10-10 10:18:37.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:38 compute-2 ceph-mon[74913]: pgmap v980: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Oct 10 10:18:38 compute-2 nova_compute[235775]: 2025-10-10 10:18:38.313 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1760091503.3117325, 49a68fc9-f469-4827-9bb8-f2c2981d2b68 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:18:38 compute-2 nova_compute[235775]: 2025-10-10 10:18:38.314 2 INFO nova.compute.manager [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] VM Stopped (Lifecycle Event)
Oct 10 10:18:38 compute-2 nova_compute[235775]: 2025-10-10 10:18:38.341 2 DEBUG nova.compute.manager [None req-c5ab02e4-6a6d-4654-861d-36fca741e53c - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:18:38 compute-2 nova_compute[235775]: 2025-10-10 10:18:38.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:38.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:39.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:39 compute-2 sudo[245761]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:18:39 compute-2 sudo[245761]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:18:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:39 compute-2 sudo[245761]: pam_unix(sudo:session): session closed for user root
Oct 10 10:18:40 compute-2 ceph-mon[74913]: pgmap v981: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Oct 10 10:18:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:40.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:41 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:41.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:41.473 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:41.474 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:18:41.474 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:42 compute-2 ceph-mon[74913]: pgmap v982: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:18:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:42.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:42 compute-2 nova_compute[235775]: 2025-10-10 10:18:42.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:43.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:43 compute-2 nova_compute[235775]: 2025-10-10 10:18:43.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:44 compute-2 ceph-mon[74913]: pgmap v983: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:18:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:44.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:44 compute-2 podman[245791]: 2025-10-10 10:18:44.774125874 +0000 UTC m=+0.049686393 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:18:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:45.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:46 compute-2 ceph-mon[74913]: pgmap v984: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:18:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3655689645' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:46.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:47.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:18:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:47 compute-2 nova_compute[235775]: 2025-10-10 10:18:47.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:48 compute-2 ceph-mon[74913]: pgmap v985: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:18:48 compute-2 nova_compute[235775]: 2025-10-10 10:18:48.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:48.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:49.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:50 compute-2 ceph-mon[74913]: pgmap v986: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:18:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:50.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:51.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:52 compute-2 ceph-mon[74913]: pgmap v987: 353 pgs: 353 active+clean; 88 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:18:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:52.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:52 compute-2 nova_compute[235775]: 2025-10-10 10:18:52.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:18:52 compute-2 nova_compute[235775]: 2025-10-10 10:18:52.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:18:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:52 compute-2 nova_compute[235775]: 2025-10-10 10:18:52.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:53.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:53 compute-2 nova_compute[235775]: 2025-10-10 10:18:53.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:54 compute-2 ceph-mon[74913]: pgmap v988: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:18:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/434918906' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:18:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:54.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:55.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3376492106' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:18:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:18:55 compute-2 nova_compute[235775]: 2025-10-10 10:18:55.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:18:55 compute-2 nova_compute[235775]: 2025-10-10 10:18:55.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:18:55 compute-2 nova_compute[235775]: 2025-10-10 10:18:55.816 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:18:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:55 compute-2 nova_compute[235775]: 2025-10-10 10:18:55.844 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:18:55 compute-2 nova_compute[235775]: 2025-10-10 10:18:55.844 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:18:55 compute-2 nova_compute[235775]: 2025-10-10 10:18:55.845 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:18:55 compute-2 nova_compute[235775]: 2025-10-10 10:18:55.875 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:55 compute-2 nova_compute[235775]: 2025-10-10 10:18:55.875 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:55 compute-2 nova_compute[235775]: 2025-10-10 10:18:55.876 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:55 compute-2 nova_compute[235775]: 2025-10-10 10:18:55.876 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:18:55 compute-2 nova_compute[235775]: 2025-10-10 10:18:55.876 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:18:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:18:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:18:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:18:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:18:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:18:56 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/475021902' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:56 compute-2 ceph-mon[74913]: pgmap v989: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:18:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/475021902' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:56 compute-2 nova_compute[235775]: 2025-10-10 10:18:56.337 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:18:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:56.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:56 compute-2 nova_compute[235775]: 2025-10-10 10:18:56.542 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:18:56 compute-2 nova_compute[235775]: 2025-10-10 10:18:56.544 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4900MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:18:56 compute-2 nova_compute[235775]: 2025-10-10 10:18:56.544 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:18:56 compute-2 nova_compute[235775]: 2025-10-10 10:18:56.545 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:18:56 compute-2 nova_compute[235775]: 2025-10-10 10:18:56.627 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:18:56 compute-2 nova_compute[235775]: 2025-10-10 10:18:56.628 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:18:56 compute-2 nova_compute[235775]: 2025-10-10 10:18:56.644 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:18:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:57 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:18:57 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3962479693' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:57 compute-2 nova_compute[235775]: 2025-10-10 10:18:57.073 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:18:57 compute-2 nova_compute[235775]: 2025-10-10 10:18:57.078 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:18:57 compute-2 nova_compute[235775]: 2025-10-10 10:18:57.105 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:18:57 compute-2 nova_compute[235775]: 2025-10-10 10:18:57.129 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:18:57 compute-2 nova_compute[235775]: 2025-10-10 10:18:57.129 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:18:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:57.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3962479693' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:18:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:57 compute-2 nova_compute[235775]: 2025-10-10 10:18:57.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:58 compute-2 nova_compute[235775]: 2025-10-10 10:18:58.099 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:18:58 compute-2 ceph-mon[74913]: pgmap v990: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:18:58 compute-2 nova_compute[235775]: 2025-10-10 10:18:58.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:18:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:18:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:58.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:18:58 compute-2 nova_compute[235775]: 2025-10-10 10:18:58.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:18:58 compute-2 nova_compute[235775]: 2025-10-10 10:18:58.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:18:58 compute-2 nova_compute[235775]: 2025-10-10 10:18:58.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:18:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:18:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:18:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:59.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:18:59 compute-2 nova_compute[235775]: 2025-10-10 10:18:59.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:18:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:18:59 compute-2 sudo[245870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:18:59 compute-2 sudo[245870]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:18:59 compute-2 sudo[245870]: pam_unix(sudo:session): session closed for user root
Oct 10 10:19:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:00 compute-2 ceph-mon[74913]: pgmap v991: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Oct 10 10:19:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/330076369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:19:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:00.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:19:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:01.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:19:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1712929211' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:19:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:19:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:02 compute-2 ceph-mon[74913]: pgmap v992: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 8.4 KiB/s rd, 12 KiB/s wr, 10 op/s
Oct 10 10:19:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:02.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:02 compute-2 nova_compute[235775]: 2025-10-10 10:19:02.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:03 compute-2 sudo[245899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:19:03 compute-2 sudo[245899]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:19:03 compute-2 sudo[245899]: pam_unix(sudo:session): session closed for user root
Oct 10 10:19:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:19:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:03.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:19:03 compute-2 sudo[245924]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Oct 10 10:19:03 compute-2 sudo[245924]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:19:03 compute-2 nova_compute[235775]: 2025-10-10 10:19:03.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3651707172' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:19:03 compute-2 ceph-mon[74913]: pgmap v993: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 75 op/s
Oct 10 10:19:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/4015442307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:19:03 compute-2 podman[246021]: 2025-10-10 10:19:03.720351498 +0000 UTC m=+0.053309608 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Oct 10 10:19:03 compute-2 podman[246021]: 2025-10-10 10:19:03.812186919 +0000 UTC m=+0.145145009 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True)
Oct 10 10:19:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:03 compute-2 podman[246055]: 2025-10-10 10:19:03.942642906 +0000 UTC m=+0.060530570 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:19:03 compute-2 podman[246058]: 2025-10-10 10:19:03.954734123 +0000 UTC m=+0.075423186 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 10 10:19:04 compute-2 podman[246060]: 2025-10-10 10:19:04.009962252 +0000 UTC m=+0.129763776 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:19:04 compute-2 podman[246203]: 2025-10-10 10:19:04.238444407 +0000 UTC m=+0.052002606 container exec 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 10:19:04 compute-2 podman[246203]: 2025-10-10 10:19:04.269127899 +0000 UTC m=+0.082686098 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 10:19:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:04.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:04 compute-2 podman[246293]: 2025-10-10 10:19:04.577951098 +0000 UTC m=+0.052517944 container exec eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Oct 10 10:19:04 compute-2 podman[246293]: 2025-10-10 10:19:04.58615481 +0000 UTC m=+0.060721656 container exec_died eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Oct 10 10:19:04 compute-2 podman[246357]: 2025-10-10 10:19:04.765348758 +0000 UTC m=+0.047490922 container exec 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 10:19:04 compute-2 podman[246357]: 2025-10-10 10:19:04.774104278 +0000 UTC m=+0.056246342 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 10:19:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:04 compute-2 podman[246423]: 2025-10-10 10:19:04.994547286 +0000 UTC m=+0.071845421 container exec 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, architecture=x86_64, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, name=keepalived, release=1793, version=2.2.4, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Oct 10 10:19:05 compute-2 podman[246423]: 2025-10-10 10:19:05.01531154 +0000 UTC m=+0.092609655 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, distribution-scope=public, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, architecture=x86_64, name=keepalived, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git)
Oct 10 10:19:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:19:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:05.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:19:05 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:05 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:05 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct 10 10:19:05 compute-2 sudo[245924]: pam_unix(sudo:session): session closed for user root
Oct 10 10:19:05 compute-2 sudo[246493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:19:05 compute-2 sudo[246493]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:19:05 compute-2 sudo[246493]: pam_unix(sudo:session): session closed for user root
Oct 10 10:19:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:05 compute-2 sudo[246518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:19:05 compute-2 sudo[246518]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:19:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:05 compute-2 sudo[246518]: pam_unix(sudo:session): session closed for user root
Oct 10 10:19:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:06 compute-2 sudo[246574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:19:06 compute-2 sudo[246574]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:19:06 compute-2 sudo[246574]: pam_unix(sudo:session): session closed for user root
Oct 10 10:19:06 compute-2 sudo[246599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Oct 10 10:19:06 compute-2 sudo[246599]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:19:06 compute-2 ceph-mon[74913]: pgmap v994: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Oct 10 10:19:06 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:06 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:06 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:06 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:06 compute-2 sudo[246599]: pam_unix(sudo:session): session closed for user root
Oct 10 10:19:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:06.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:07.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:19:07 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:19:07 compute-2 ceph-mon[74913]: pgmap v995: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Oct 10 10:19:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:07 compute-2 nova_compute[235775]: 2025-10-10 10:19:07.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:07 compute-2 ovn_controller[132503]: 2025-10-10T10:19:07Z|00046|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 10 10:19:08 compute-2 nova_compute[235775]: 2025-10-10 10:19:08.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:08.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:09.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:10 compute-2 ceph-mon[74913]: pgmap v996: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 76 op/s
Oct 10 10:19:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:10.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:11 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:11.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:12 compute-2 sudo[246651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:19:12 compute-2 sudo[246651]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:19:12 compute-2 sudo[246651]: pam_unix(sudo:session): session closed for user root
Oct 10 10:19:12 compute-2 ceph-mon[74913]: pgmap v997: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 65 op/s
Oct 10 10:19:12 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:12 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:19:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:12.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:12 compute-2 nova_compute[235775]: 2025-10-10 10:19:12.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:13.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:13 compute-2 nova_compute[235775]: 2025-10-10 10:19:13.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:14 compute-2 ceph-mon[74913]: pgmap v998: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 126 op/s
Oct 10 10:19:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:14.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:15.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:15 compute-2 podman[246680]: 2025-10-10 10:19:15.814778919 +0000 UTC m=+0.073590448 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:19:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:16 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:16 compute-2 ceph-mon[74913]: pgmap v999: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 306 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Oct 10 10:19:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:16.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:19:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:17.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:19:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:19:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:17 compute-2 nova_compute[235775]: 2025-10-10 10:19:17.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:18 compute-2 ceph-mon[74913]: pgmap v1000: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 306 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Oct 10 10:19:18 compute-2 nova_compute[235775]: 2025-10-10 10:19:18.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:19:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:18.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:19:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:19:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:19.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:19:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:20 compute-2 sudo[246703]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:19:20 compute-2 sudo[246703]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:19:20 compute-2 sudo[246703]: pam_unix(sudo:session): session closed for user root
Oct 10 10:19:20 compute-2 ceph-mon[74913]: pgmap v1001: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 307 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Oct 10 10:19:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:19:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:20.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:19:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:21 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:21.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:22 compute-2 ceph-mon[74913]: pgmap v1002: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 303 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Oct 10 10:19:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:22.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:22 compute-2 nova_compute[235775]: 2025-10-10 10:19:22.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:22 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:22.533 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:19:22 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:22.536 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:19:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:22 compute-2 nova_compute[235775]: 2025-10-10 10:19:22.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 10:19:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:23.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 10:19:23 compute-2 nova_compute[235775]: 2025-10-10 10:19:23.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:24 compute-2 ceph-mon[74913]: pgmap v1003: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 304 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Oct 10 10:19:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:24.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:25.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:25 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2257682561' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:19:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:26 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:26 compute-2 ceph-mon[74913]: pgmap v1004: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 16 KiB/s wr, 1 op/s
Oct 10 10:19:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:26.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:27.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1758263211' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:19:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1758263211' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:19:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:27 compute-2 nova_compute[235775]: 2025-10-10 10:19:27.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:28 compute-2 ceph-mon[74913]: pgmap v1005: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 16 KiB/s wr, 1 op/s
Oct 10 10:19:28 compute-2 nova_compute[235775]: 2025-10-10 10:19:28.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:28.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:28 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:28.538 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:19:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:19:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:29.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:19:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:30 compute-2 ceph-mon[74913]: pgmap v1006: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 17 KiB/s wr, 30 op/s
Oct 10 10:19:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:30.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:19:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:31.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:19:31 compute-2 ceph-mon[74913]: pgmap v1007: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 6.5 KiB/s wr, 29 op/s
Oct 10 10:19:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:19:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:19:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:32.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:19:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:32 compute-2 nova_compute[235775]: 2025-10-10 10:19:32.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:19:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:33.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:19:33 compute-2 nova_compute[235775]: 2025-10-10 10:19:33.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:34 compute-2 ceph-mon[74913]: pgmap v1008: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 6.5 KiB/s wr, 29 op/s
Oct 10 10:19:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:34.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:34 compute-2 podman[246745]: 2025-10-10 10:19:34.817279583 +0000 UTC m=+0.085274471 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 10:19:34 compute-2 podman[246746]: 2025-10-10 10:19:34.821999994 +0000 UTC m=+0.088443293 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid)
Oct 10 10:19:34 compute-2 podman[246744]: 2025-10-10 10:19:34.823703449 +0000 UTC m=+0.095053005 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 10 10:19:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:35.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:36 compute-2 ceph-mon[74913]: pgmap v1009: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Oct 10 10:19:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:36.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:37.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:37 compute-2 nova_compute[235775]: 2025-10-10 10:19:37.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:38 compute-2 ceph-mon[74913]: pgmap v1010: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Oct 10 10:19:38 compute-2 nova_compute[235775]: 2025-10-10 10:19:38.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:38.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:19:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:39.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:19:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:40 compute-2 sudo[246809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:19:40 compute-2 sudo[246809]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:19:40 compute-2 sudo[246809]: pam_unix(sudo:session): session closed for user root
Oct 10 10:19:40 compute-2 ceph-mon[74913]: pgmap v1011: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 29 op/s
Oct 10 10:19:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:40.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:41 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:41.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:41.474 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:19:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:41.474 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:19:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:41.474 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:19:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:42 compute-2 ceph-mon[74913]: pgmap v1012: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:19:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:42.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:42 compute-2 nova_compute[235775]: 2025-10-10 10:19:42.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:43.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:43 compute-2 nova_compute[235775]: 2025-10-10 10:19:43.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:44 compute-2 ceph-mon[74913]: pgmap v1013: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:19:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:19:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:44.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:19:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:45.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:46 compute-2 ceph-mon[74913]: pgmap v1014: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:19:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:46.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:46 compute-2 nova_compute[235775]: 2025-10-10 10:19:46.770 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:19:46 compute-2 nova_compute[235775]: 2025-10-10 10:19:46.770 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:19:46 compute-2 podman[246841]: 2025-10-10 10:19:46.783635366 +0000 UTC m=+0.056637975 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 10 10:19:46 compute-2 nova_compute[235775]: 2025-10-10 10:19:46.787 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 10 10:19:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:46 compute-2 nova_compute[235775]: 2025-10-10 10:19:46.869 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:19:46 compute-2 nova_compute[235775]: 2025-10-10 10:19:46.870 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:19:46 compute-2 nova_compute[235775]: 2025-10-10 10:19:46.876 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 10 10:19:46 compute-2 nova_compute[235775]: 2025-10-10 10:19:46.877 2 INFO nova.compute.claims [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Claim successful on node compute-2.ctlplane.example.com
Oct 10 10:19:46 compute-2 nova_compute[235775]: 2025-10-10 10:19:46.973 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:19:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:47.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:19:47 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:19:47 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2667140307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.410 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.416 2 DEBUG nova.compute.provider_tree [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.436 2 DEBUG nova.scheduler.client.report [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.474 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.474 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.549 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.550 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.579 2 INFO nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.605 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.734 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.735 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.736 2 INFO nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Creating image(s)
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.757 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.785 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.809 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.813 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.838 2 DEBUG nova.policy [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7956778c03764aaf8906c9b435337976', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 10 10:19:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.885 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.885 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.886 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.886 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.908 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.912 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:19:47 compute-2 nova_compute[235775]: 2025-10-10 10:19:47.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:48 compute-2 nova_compute[235775]: 2025-10-10 10:19:48.132 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:19:48 compute-2 nova_compute[235775]: 2025-10-10 10:19:48.188 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] resizing rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct 10 10:19:48 compute-2 nova_compute[235775]: 2025-10-10 10:19:48.276 2 DEBUG nova.objects.instance [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 4fd38b02-f79c-4eb5-9939-6939dda28a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 10:19:48 compute-2 ceph-mon[74913]: pgmap v1015: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:19:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2667140307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:19:48 compute-2 nova_compute[235775]: 2025-10-10 10:19:48.289 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 10 10:19:48 compute-2 nova_compute[235775]: 2025-10-10 10:19:48.290 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Ensure instance console log exists: /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 10 10:19:48 compute-2 nova_compute[235775]: 2025-10-10 10:19:48.290 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:19:48 compute-2 nova_compute[235775]: 2025-10-10 10:19:48.290 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:19:48 compute-2 nova_compute[235775]: 2025-10-10 10:19:48.290 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:19:48 compute-2 nova_compute[235775]: 2025-10-10 10:19:48.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:48 compute-2 nova_compute[235775]: 2025-10-10 10:19:48.431 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Successfully created port: 7369f952-1f44-445c-9449-347d6d476d79 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 10 10:19:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:48.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:49.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:49 compute-2 nova_compute[235775]: 2025-10-10 10:19:49.357 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Successfully updated port: 7369f952-1f44-445c-9449-347d6d476d79 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 10 10:19:49 compute-2 nova_compute[235775]: 2025-10-10 10:19:49.378 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:19:49 compute-2 nova_compute[235775]: 2025-10-10 10:19:49.379 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:19:49 compute-2 nova_compute[235775]: 2025-10-10 10:19:49.379 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 10 10:19:49 compute-2 nova_compute[235775]: 2025-10-10 10:19:49.438 2 DEBUG nova.compute.manager [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-changed-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:19:49 compute-2 nova_compute[235775]: 2025-10-10 10:19:49.438 2 DEBUG nova.compute.manager [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing instance network info cache due to event network-changed-7369f952-1f44-445c-9449-347d6d476d79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 10 10:19:49 compute-2 nova_compute[235775]: 2025-10-10 10:19:49.439 2 DEBUG oslo_concurrency.lockutils [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:19:49 compute-2 nova_compute[235775]: 2025-10-10 10:19:49.533 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 10 10:19:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:50 compute-2 ceph-mon[74913]: pgmap v1016: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:19:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:50.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.592 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.615 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.615 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Instance network_info: |[{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.616 2 DEBUG oslo_concurrency.lockutils [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.617 2 DEBUG nova.network.neutron [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing network info cache for port 7369f952-1f44-445c-9449-347d6d476d79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.622 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Start _get_guest_xml network_info=[{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'encryption_options': None, 'image_id': '5ae78700-970d-45b4-a57d-978a054c7519'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.628 2 WARNING nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.633 2 DEBUG nova.virt.libvirt.host [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.634 2 DEBUG nova.virt.libvirt.host [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.648 2 DEBUG nova.virt.libvirt.host [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.649 2 DEBUG nova.virt.libvirt.host [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.649 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.650 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-10T10:09:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='00373e71-6208-4238-ad85-db0452c53bc6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.651 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.651 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.652 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.652 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.653 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.653 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.654 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.654 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.655 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.655 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct 10 10:19:50 compute-2 nova_compute[235775]: 2025-10-10 10:19:50.659 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:19:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:51 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 10:19:51 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4063882823' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.148 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.176 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.180 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:19:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:51.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:51 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4063882823' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:19:51 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 10:19:51 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1076408673' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.618 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.619 2 DEBUG nova.virt.libvirt.vif [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-900143833',display_name='tempest-TestNetworkBasicOps-server-900143833',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-900143833',id=11,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJQ6vYp+U8d7Yiink0K/iQNUjrLla5VjGnuqrTVtw+u6eTZg4qjU5w1TFNoLgk+EE3EJPtqEojXIPj0UMRCIST/kkZjRsWCJV3t0ho4U419OoM2lVk7/JJmPOAXOx5ZoVg==',key_name='tempest-TestNetworkBasicOps-780402283',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-v13j2ta3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:19:47Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=4fd38b02-f79c-4eb5-9939-6939dda28a15,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.619 2 DEBUG nova.network.os_vif_util [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.620 2 DEBUG nova.network.os_vif_util [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.621 2 DEBUG nova.objects.instance [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4fd38b02-f79c-4eb5-9939-6939dda28a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.641 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] End _get_guest_xml xml=<domain type="kvm">
Oct 10 10:19:51 compute-2 nova_compute[235775]:   <uuid>4fd38b02-f79c-4eb5-9939-6939dda28a15</uuid>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   <name>instance-0000000b</name>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   <memory>131072</memory>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   <vcpu>1</vcpu>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   <metadata>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <nova:name>tempest-TestNetworkBasicOps-server-900143833</nova:name>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <nova:creationTime>2025-10-10 10:19:50</nova:creationTime>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <nova:flavor name="m1.nano">
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <nova:memory>128</nova:memory>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <nova:disk>1</nova:disk>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <nova:swap>0</nova:swap>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <nova:ephemeral>0</nova:ephemeral>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <nova:vcpus>1</nova:vcpus>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       </nova:flavor>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <nova:owner>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <nova:user uuid="7956778c03764aaf8906c9b435337976">tempest-TestNetworkBasicOps-188749107-project-member</nova:user>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <nova:project uuid="d5e531d4b440422d946eaf6fd4e166f7">tempest-TestNetworkBasicOps-188749107</nova:project>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       </nova:owner>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <nova:root type="image" uuid="5ae78700-970d-45b4-a57d-978a054c7519"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <nova:ports>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <nova:port uuid="7369f952-1f44-445c-9449-347d6d476d79">
Oct 10 10:19:51 compute-2 nova_compute[235775]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         </nova:port>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       </nova:ports>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     </nova:instance>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   </metadata>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   <sysinfo type="smbios">
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <system>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <entry name="manufacturer">RDO</entry>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <entry name="product">OpenStack Compute</entry>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <entry name="serial">4fd38b02-f79c-4eb5-9939-6939dda28a15</entry>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <entry name="uuid">4fd38b02-f79c-4eb5-9939-6939dda28a15</entry>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <entry name="family">Virtual Machine</entry>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     </system>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   </sysinfo>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   <os>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <boot dev="hd"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <smbios mode="sysinfo"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   </os>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   <features>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <acpi/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <apic/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <vmcoreinfo/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   </features>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   <clock offset="utc">
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <timer name="pit" tickpolicy="delay"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <timer name="hpet" present="no"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   </clock>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   <cpu mode="host-model" match="exact">
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <topology sockets="1" cores="1" threads="1"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   </cpu>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   <devices>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <disk type="network" device="disk">
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <driver type="raw" cache="none"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <source protocol="rbd" name="vms/4fd38b02-f79c-4eb5-9939-6939dda28a15_disk">
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <host name="192.168.122.100" port="6789"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <host name="192.168.122.102" port="6789"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <host name="192.168.122.101" port="6789"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       </source>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <auth username="openstack">
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       </auth>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <target dev="vda" bus="virtio"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     </disk>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <disk type="network" device="cdrom">
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <driver type="raw" cache="none"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <source protocol="rbd" name="vms/4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config">
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <host name="192.168.122.100" port="6789"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <host name="192.168.122.102" port="6789"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <host name="192.168.122.101" port="6789"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       </source>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <auth username="openstack">
Oct 10 10:19:51 compute-2 nova_compute[235775]:         <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       </auth>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <target dev="sda" bus="sata"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     </disk>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <interface type="ethernet">
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <mac address="fa:16:3e:54:86:3b"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <model type="virtio"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <driver name="vhost" rx_queue_size="512"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <mtu size="1442"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <target dev="tap7369f952-1f"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     </interface>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <serial type="pty">
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <log file="/var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/console.log" append="off"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     </serial>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <video>
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <model type="virtio"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     </video>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <input type="tablet" bus="usb"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <rng model="virtio">
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <backend model="random">/dev/urandom</backend>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     </rng>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="pci" model="pcie-root-port"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <controller type="usb" index="0"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     <memballoon model="virtio">
Oct 10 10:19:51 compute-2 nova_compute[235775]:       <stats period="10"/>
Oct 10 10:19:51 compute-2 nova_compute[235775]:     </memballoon>
Oct 10 10:19:51 compute-2 nova_compute[235775]:   </devices>
Oct 10 10:19:51 compute-2 nova_compute[235775]: </domain>
Oct 10 10:19:51 compute-2 nova_compute[235775]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.643 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Preparing to wait for external event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.643 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.643 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.643 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.644 2 DEBUG nova.virt.libvirt.vif [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-900143833',display_name='tempest-TestNetworkBasicOps-server-900143833',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-900143833',id=11,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJQ6vYp+U8d7Yiink0K/iQNUjrLla5VjGnuqrTVtw+u6eTZg4qjU5w1TFNoLgk+EE3EJPtqEojXIPj0UMRCIST/kkZjRsWCJV3t0ho4U419OoM2lVk7/JJmPOAXOx5ZoVg==',key_name='tempest-TestNetworkBasicOps-780402283',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-v13j2ta3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:19:47Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=4fd38b02-f79c-4eb5-9939-6939dda28a15,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.644 2 DEBUG nova.network.os_vif_util [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.644 2 DEBUG nova.network.os_vif_util [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.645 2 DEBUG os_vif [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.646 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.646 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.649 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7369f952-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.650 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7369f952-1f, col_values=(('external_ids', {'iface-id': '7369f952-1f44-445c-9449-347d6d476d79', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:86:3b', 'vm-uuid': '4fd38b02-f79c-4eb5-9939-6939dda28a15'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:51 compute-2 NetworkManager[44866]: <info>  [1760091591.6527] manager: (tap7369f952-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.658 2 INFO os_vif [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f')
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.704 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.705 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.705 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No VIF found with MAC fa:16:3e:54:86:3b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.706 2 INFO nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Using config drive
Oct 10 10:19:51 compute-2 nova_compute[235775]: 2025-10-10 10:19:51.730 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:19:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.129 2 INFO nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Creating config drive at /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.138 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwj6ye7zz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.265 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwj6ye7zz" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.308 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.314 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:19:52 compute-2 ceph-mon[74913]: pgmap v1017: 353 pgs: 353 active+clean; 41 MiB data, 279 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:19:52 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1076408673' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.431 2 DEBUG nova.network.neutron [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated VIF entry in instance network info cache for port 7369f952-1f44-445c-9449-347d6d476d79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.434 2 DEBUG nova.network.neutron [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.454 2 DEBUG oslo_concurrency.lockutils [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.499 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.185s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.500 2 INFO nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Deleting local config drive /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config because it was imported into RBD.
Oct 10 10:19:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:52.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:52 compute-2 kernel: tap7369f952-1f: entered promiscuous mode
Oct 10 10:19:52 compute-2 NetworkManager[44866]: <info>  [1760091592.5602] manager: (tap7369f952-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:52 compute-2 ovn_controller[132503]: 2025-10-10T10:19:52Z|00047|binding|INFO|Claiming lport 7369f952-1f44-445c-9449-347d6d476d79 for this chassis.
Oct 10 10:19:52 compute-2 ovn_controller[132503]: 2025-10-10T10:19:52Z|00048|binding|INFO|7369f952-1f44-445c-9449-347d6d476d79: Claiming fa:16:3e:54:86:3b 10.100.0.5
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.580 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:86:3b 10.100.0.5'], port_security=['fa:16:3e:54:86:3b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4fd38b02-f79c-4eb5-9939-6939dda28a15', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '26e88f36-7c05-4376-877b-78cbbe604817', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc36a9e4-a12c-4b9d-8968-49f72bde3476, chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=7369f952-1f44-445c-9449-347d6d476d79) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.581 141795 INFO neutron.agent.ovn.metadata.agent [-] Port 7369f952-1f44-445c-9449-347d6d476d79 in datapath fb3e50c5-fe48-4113-87d7-4e11945ac752 bound to our chassis
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.583 141795 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fb3e50c5-fe48-4113-87d7-4e11945ac752
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.595 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a5209d-f839-4c54-81c9-82e770fef56f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.596 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfb3e50c5-f1 in ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.599 241439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfb3e50c5-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.599 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[b428f4ac-b291-4d0b-85b0-78afdce8601d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.600 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[40656614-5900-4c3b-8a63-141b3db6bc29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 systemd-udevd[247191]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 10:19:52 compute-2 systemd-machined[192768]: New machine qemu-3-instance-0000000b.
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.617 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[f19818bc-6889-4b75-a21a-5ccfa8fea535]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 NetworkManager[44866]: <info>  [1760091592.6319] device (tap7369f952-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 10:19:52 compute-2 NetworkManager[44866]: <info>  [1760091592.6329] device (tap7369f952-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 10 10:19:52 compute-2 systemd[1]: Started Virtual Machine qemu-3-instance-0000000b.
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.647 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[d41a17c2-2079-4b32-8fe8-6c553e1d97ea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 ovn_controller[132503]: 2025-10-10T10:19:52Z|00049|binding|INFO|Setting lport 7369f952-1f44-445c-9449-347d6d476d79 ovn-installed in OVS
Oct 10 10:19:52 compute-2 ovn_controller[132503]: 2025-10-10T10:19:52Z|00050|binding|INFO|Setting lport 7369f952-1f44-445c-9449-347d6d476d79 up in Southbound
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.682 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[00057ab0-6ced-44cf-901c-e736930d4965]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 NetworkManager[44866]: <info>  [1760091592.6889] manager: (tapfb3e50c5-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.689 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[b302f714-7a70-42fb-af8d-2d611b66b34f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.715 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[3bee4312-9e2f-4e71-b973-115af44bdada]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.718 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6eccf4-9045-412b-abd6-19b384038d7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 NetworkManager[44866]: <info>  [1760091592.7449] device (tapfb3e50c5-f0): carrier: link connected
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.749 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[483e18c8-7084-4dea-9d2b-46fdd9a0f8c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.764 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c009c4-838f-428d-a322-4d5c739aab2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb3e50c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:c3:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449630, 'reachable_time': 40556, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247224, 'error': None, 'target': 'ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.783 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[4360f4ee-37f7-4d15-9dfe-c24f456516a1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:c3b9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449630, 'tstamp': 449630}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247226, 'error': None, 'target': 'ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.801 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e26675-d92c-4d42-b87c-65eb1a389e9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb3e50c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:c3:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449630, 'reachable_time': 40556, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247227, 'error': None, 'target': 'ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.841 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[4615adff-cb38-4d11-9d41-e92cc98e31eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.915 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[47c9888a-7067-420d-9416-027c9225bb9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.916 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb3e50c5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.917 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.918 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb3e50c5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:19:52 compute-2 NetworkManager[44866]: <info>  [1760091592.9207] manager: (tapfb3e50c5-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Oct 10 10:19:52 compute-2 kernel: tapfb3e50c5-f0: entered promiscuous mode
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.931 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfb3e50c5-f0, col_values=(('external_ids', {'iface-id': '50744b55-fb9e-4bc1-a3e6-4ad27846c672'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:52 compute-2 ovn_controller[132503]: 2025-10-10T10:19:52Z|00051|binding|INFO|Releasing lport 50744b55-fb9e-4bc1-a3e6-4ad27846c672 from this chassis (sb_readonly=0)
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.935 141795 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fb3e50c5-fe48-4113-87d7-4e11945ac752.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fb3e50c5-fe48-4113-87d7-4e11945ac752.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.938 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0df315-2f83-4249-ad8c-d81ae6411d7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.939 141795 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: global
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     log         /dev/log local0 debug
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     log-tag     haproxy-metadata-proxy-fb3e50c5-fe48-4113-87d7-4e11945ac752
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     user        root
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     group       root
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     maxconn     1024
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     pidfile     /var/lib/neutron/external/pids/fb3e50c5-fe48-4113-87d7-4e11945ac752.pid.haproxy
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     daemon
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: defaults
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     log global
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     mode http
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     option httplog
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     option dontlognull
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     option http-server-close
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     option forwardfor
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     retries                 3
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     timeout http-request    30s
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     timeout connect         30s
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     timeout client          32s
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     timeout server          32s
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     timeout http-keep-alive 30s
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: listen listener
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     bind 169.254.169.254:80
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     server metadata /var/lib/neutron/metadata_proxy
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:     http-request add-header X-OVN-Network-ID fb3e50c5-fe48-4113-87d7-4e11945ac752
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct 10 10:19:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.940 141795 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'env', 'PROCESS_TAG=haproxy-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fb3e50c5-fe48-4113-87d7-4e11945ac752.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:52 compute-2 nova_compute[235775]: 2025-10-10 10:19:52.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.046 2 DEBUG nova.compute.manager [req-d6edb4e0-18b2-4133-9947-1fb8b4178ed4 req-4de2b525-c328-48e0-b430-e427f46aee5a 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.046 2 DEBUG oslo_concurrency.lockutils [req-d6edb4e0-18b2-4133-9947-1fb8b4178ed4 req-4de2b525-c328-48e0-b430-e427f46aee5a 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.047 2 DEBUG oslo_concurrency.lockutils [req-d6edb4e0-18b2-4133-9947-1fb8b4178ed4 req-4de2b525-c328-48e0-b430-e427f46aee5a 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.047 2 DEBUG oslo_concurrency.lockutils [req-d6edb4e0-18b2-4133-9947-1fb8b4178ed4 req-4de2b525-c328-48e0-b430-e427f46aee5a 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.047 2 DEBUG nova.compute.manager [req-d6edb4e0-18b2-4133-9947-1fb8b4178ed4 req-4de2b525-c328-48e0-b430-e427f46aee5a 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Processing event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 10 10:19:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:19:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:53.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:19:53 compute-2 podman[247302]: 2025-10-10 10:19:53.316491556 +0000 UTC m=+0.045393463 container create c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 10:19:53 compute-2 systemd[1]: Started libpod-conmon-c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3.scope.
Oct 10 10:19:53 compute-2 podman[247302]: 2025-10-10 10:19:53.293126089 +0000 UTC m=+0.022028026 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 10:19:53 compute-2 systemd[1]: Started libcrun container.
Oct 10 10:19:53 compute-2 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46f11d884e8a0583bed848a92a7b64922d9783a2c80338a45ab3d568340bb2fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 10 10:19:53 compute-2 podman[247302]: 2025-10-10 10:19:53.412325475 +0000 UTC m=+0.141227392 container init c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 10:19:53 compute-2 podman[247302]: 2025-10-10 10:19:53.416909852 +0000 UTC m=+0.145811759 container start c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 10 10:19:53 compute-2 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [NOTICE]   (247322) : New worker (247324) forked
Oct 10 10:19:53 compute-2 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [NOTICE]   (247322) : Loading success.
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.518 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091593.5181386, 4fd38b02-f79c-4eb5-9939-6939dda28a15 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.519 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] VM Started (Lifecycle Event)
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.520 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.523 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.526 2 INFO nova.virt.libvirt.driver [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Instance spawned successfully.
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.526 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.545 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.550 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.554 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.554 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.555 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.555 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.555 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.556 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.583 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.583 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091593.5183418, 4fd38b02-f79c-4eb5-9939-6939dda28a15 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.584 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] VM Paused (Lifecycle Event)
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.606 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.609 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091593.5228786, 4fd38b02-f79c-4eb5-9939-6939dda28a15 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.609 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] VM Resumed (Lifecycle Event)
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.611 2 INFO nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Took 5.88 seconds to spawn the instance on the hypervisor.
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.611 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.637 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.639 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.667 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.680 2 INFO nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Took 6.84 seconds to build instance.
Oct 10 10:19:53 compute-2 nova_compute[235775]: 2025-10-10 10:19:53.697 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:19:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:54 compute-2 ceph-mon[74913]: pgmap v1018: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:19:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:54.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:54 compute-2 nova_compute[235775]: 2025-10-10 10:19:54.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:19:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:55 compute-2 nova_compute[235775]: 2025-10-10 10:19:55.135 2 DEBUG nova.compute.manager [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:19:55 compute-2 nova_compute[235775]: 2025-10-10 10:19:55.136 2 DEBUG oslo_concurrency.lockutils [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:19:55 compute-2 nova_compute[235775]: 2025-10-10 10:19:55.136 2 DEBUG oslo_concurrency.lockutils [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:19:55 compute-2 nova_compute[235775]: 2025-10-10 10:19:55.136 2 DEBUG oslo_concurrency.lockutils [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:19:55 compute-2 nova_compute[235775]: 2025-10-10 10:19:55.136 2 DEBUG nova.compute.manager [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:19:55 compute-2 nova_compute[235775]: 2025-10-10 10:19:55.136 2 WARNING nova.compute.manager [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state None.
Oct 10 10:19:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:19:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:55.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:19:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:19:55 compute-2 ovn_controller[132503]: 2025-10-10T10:19:55Z|00052|binding|INFO|Releasing lport 50744b55-fb9e-4bc1-a3e6-4ad27846c672 from this chassis (sb_readonly=0)
Oct 10 10:19:55 compute-2 nova_compute[235775]: 2025-10-10 10:19:55.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:55 compute-2 NetworkManager[44866]: <info>  [1760091595.6649] manager: (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Oct 10 10:19:55 compute-2 NetworkManager[44866]: <info>  [1760091595.6663] manager: (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Oct 10 10:19:55 compute-2 nova_compute[235775]: 2025-10-10 10:19:55.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:55 compute-2 ovn_controller[132503]: 2025-10-10T10:19:55Z|00053|binding|INFO|Releasing lport 50744b55-fb9e-4bc1-a3e6-4ad27846c672 from this chassis (sb_readonly=0)
Oct 10 10:19:55 compute-2 nova_compute[235775]: 2025-10-10 10:19:55.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:55 compute-2 nova_compute[235775]: 2025-10-10 10:19:55.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:19:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:19:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:19:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:19:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:19:56 compute-2 nova_compute[235775]: 2025-10-10 10:19:56.123 2 DEBUG nova.compute.manager [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-changed-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:19:56 compute-2 nova_compute[235775]: 2025-10-10 10:19:56.124 2 DEBUG nova.compute.manager [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing instance network info cache due to event network-changed-7369f952-1f44-445c-9449-347d6d476d79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 10 10:19:56 compute-2 nova_compute[235775]: 2025-10-10 10:19:56.124 2 DEBUG oslo_concurrency.lockutils [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:19:56 compute-2 nova_compute[235775]: 2025-10-10 10:19:56.124 2 DEBUG oslo_concurrency.lockutils [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:19:56 compute-2 nova_compute[235775]: 2025-10-10 10:19:56.124 2 DEBUG nova.network.neutron [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing network info cache for port 7369f952-1f44-445c-9449-347d6d476d79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 10 10:19:56 compute-2 ceph-mon[74913]: pgmap v1019: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:19:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:56.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:56 compute-2 nova_compute[235775]: 2025-10-10 10:19:56.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:56 compute-2 nova_compute[235775]: 2025-10-10 10:19:56.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:19:56 compute-2 nova_compute[235775]: 2025-10-10 10:19:56.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:19:56 compute-2 nova_compute[235775]: 2025-10-10 10:19:56.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:19:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:56 compute-2 nova_compute[235775]: 2025-10-10 10:19:56.993 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:19:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:57.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:57 compute-2 nova_compute[235775]: 2025-10-10 10:19:57.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:19:58 compute-2 nova_compute[235775]: 2025-10-10 10:19:58.090 2 DEBUG nova.network.neutron [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated VIF entry in instance network info cache for port 7369f952-1f44-445c-9449-347d6d476d79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 10 10:19:58 compute-2 nova_compute[235775]: 2025-10-10 10:19:58.090 2 DEBUG nova.network.neutron [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:19:58 compute-2 nova_compute[235775]: 2025-10-10 10:19:58.116 2 DEBUG oslo_concurrency.lockutils [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:19:58 compute-2 nova_compute[235775]: 2025-10-10 10:19:58.116 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:19:58 compute-2 nova_compute[235775]: 2025-10-10 10:19:58.116 2 DEBUG nova.network.neutron [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct 10 10:19:58 compute-2 nova_compute[235775]: 2025-10-10 10:19:58.116 2 DEBUG nova.objects.instance [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4fd38b02-f79c-4eb5-9939-6939dda28a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 10:19:58 compute-2 ceph-mon[74913]: pgmap v1020: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:19:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:58.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:19:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:19:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:59.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:19:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:19:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:00 compute-2 sudo[247340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:20:00 compute-2 sudo[247340]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:20:00 compute-2 sudo[247340]: pam_unix(sudo:session): session closed for user root
Oct 10 10:20:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:00 compute-2 ceph-mon[74913]: pgmap v1021: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Oct 10 10:20:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3969596316' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:00 compute-2 ceph-mon[74913]: overall HEALTH_OK
Oct 10 10:20:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:00.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:01.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3574646968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:20:01 compute-2 nova_compute[235775]: 2025-10-10 10:20:01.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.205 2 DEBUG nova.network.neutron [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.223 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.224 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.224 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.224 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.224 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.225 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.225 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.225 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.252 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.252 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.253 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.253 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.253 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:20:02 compute-2 ceph-mon[74913]: pgmap v1022: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Oct 10 10:20:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:02.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:20:02 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2835618541' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.702 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.799 2 DEBUG nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.799 2 DEBUG nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Oct 10 10:20:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.965 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.967 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4639MB free_disk=59.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.967 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.968 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:02 compute-2 nova_compute[235775]: 2025-10-10 10:20:02.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.042 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Instance 4fd38b02-f79c-4eb5-9939-6939dda28a15 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.043 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.043 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.062 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing inventories for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.087 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating ProviderTree inventory for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.088 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.106 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing aggregate associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.130 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing trait associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, traits: HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_SVM,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.170 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:20:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:03.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2835618541' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:03 compute-2 ceph-mon[74913]: pgmap v1023: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Oct 10 10:20:03 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:20:03 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1513687594' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.641 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.647 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.665 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.688 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:20:03 compute-2 nova_compute[235775]: 2025-10-10 10:20:03.689 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1396745541' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1513687594' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/190815280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:04.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:05.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:05 compute-2 ceph-mon[74913]: pgmap v1024: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Oct 10 10:20:05 compute-2 podman[247418]: 2025-10-10 10:20:05.790683469 +0000 UTC m=+0.060121106 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001)
Oct 10 10:20:05 compute-2 podman[247416]: 2025-10-10 10:20:05.79101943 +0000 UTC m=+0.062813113 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:20:05 compute-2 podman[247417]: 2025-10-10 10:20:05.840644119 +0000 UTC m=+0.112504334 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct 10 10:20:05 compute-2 ovn_controller[132503]: 2025-10-10T10:20:05Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:54:86:3b 10.100.0.5
Oct 10 10:20:05 compute-2 ovn_controller[132503]: 2025-10-10T10:20:05Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:54:86:3b 10.100.0.5
Oct 10 10:20:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1617462293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:06.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:06 compute-2 nova_compute[235775]: 2025-10-10 10:20:06.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:07.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:07 compute-2 ceph-mon[74913]: pgmap v1025: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Oct 10 10:20:07 compute-2 nova_compute[235775]: 2025-10-10 10:20:07.685 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:07 compute-2 nova_compute[235775]: 2025-10-10 10:20:07.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:08.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:09.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:10 compute-2 ceph-mon[74913]: pgmap v1026: 353 pgs: 353 active+clean; 167 MiB data, 339 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 3.9 MiB/s wr, 164 op/s
Oct 10 10:20:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1482921149' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:20:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1189993806' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:20:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:20:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:10.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:20:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:11.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:11 compute-2 nova_compute[235775]: 2025-10-10 10:20:11.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:12 compute-2 sudo[247485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:20:12 compute-2 sudo[247485]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:20:12 compute-2 sudo[247485]: pam_unix(sudo:session): session closed for user root
Oct 10 10:20:12 compute-2 ceph-mon[74913]: pgmap v1027: 353 pgs: 353 active+clean; 167 MiB data, 339 MiB used, 60 GiB / 60 GiB avail; 347 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Oct 10 10:20:12 compute-2 sudo[247510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:20:12 compute-2 sudo[247510]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:20:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:12.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:12 compute-2 sudo[247510]: pam_unix(sudo:session): session closed for user root
Oct 10 10:20:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:12 compute-2 nova_compute[235775]: 2025-10-10 10:20:12.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:20:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:13.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:20:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:14 compute-2 ceph-mon[74913]: pgmap v1028: 353 pgs: 353 active+clean; 167 MiB data, 340 MiB used, 60 GiB / 60 GiB avail; 348 KiB/s rd, 3.9 MiB/s wr, 91 op/s
Oct 10 10:20:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:14.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:15.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:16 compute-2 ceph-mon[74913]: pgmap v1029: 353 pgs: 353 active+clean; 167 MiB data, 340 MiB used, 60 GiB / 60 GiB avail; 347 KiB/s rd, 3.9 MiB/s wr, 90 op/s
Oct 10 10:20:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:20:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:20:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:20:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:20:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:20:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:20:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:20:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:20:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:20:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:16.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:16 compute-2 nova_compute[235775]: 2025-10-10 10:20:16.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:17 compute-2 ceph-mon[74913]: pgmap v1030: 353 pgs: 353 active+clean; 167 MiB data, 340 MiB used, 60 GiB / 60 GiB avail; 378 KiB/s rd, 4.3 MiB/s wr, 98 op/s
Oct 10 10:20:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:20:17 compute-2 ceph-mon[74913]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Oct 10 10:20:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:17.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:17 compute-2 podman[247572]: 2025-10-10 10:20:17.778550578 +0000 UTC m=+0.050138256 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 10 10:20:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:18 compute-2 nova_compute[235775]: 2025-10-10 10:20:17.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:18.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:19 compute-2 ceph-mon[74913]: pgmap v1031: 353 pgs: 353 active+clean; 167 MiB data, 340 MiB used, 60 GiB / 60 GiB avail; 2.5 MiB/s rd, 4.3 MiB/s wr, 179 op/s
Oct 10 10:20:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:19.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:20 compute-2 sudo[247592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:20:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:20 compute-2 sudo[247592]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:20:20 compute-2 sudo[247592]: pam_unix(sudo:session): session closed for user root
Oct 10 10:20:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:20.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:21 compute-2 sudo[247619]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:20:21 compute-2 sudo[247619]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:20:21 compute-2 sudo[247619]: pam_unix(sudo:session): session closed for user root
Oct 10 10:20:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:21.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:21 compute-2 ceph-mon[74913]: pgmap v1032: 353 pgs: 353 active+clean; 167 MiB data, 340 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 28 KiB/s wr, 82 op/s
Oct 10 10:20:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:20:21 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:20:21 compute-2 nova_compute[235775]: 2025-10-10 10:20:21.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:22.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:23 compute-2 nova_compute[235775]: 2025-10-10 10:20:23.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:23.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:23 compute-2 ceph-mon[74913]: pgmap v1033: 353 pgs: 353 active+clean; 167 MiB data, 340 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 28 KiB/s wr, 82 op/s
Oct 10 10:20:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:24.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:25.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:25 compute-2 ceph-mon[74913]: pgmap v1034: 353 pgs: 353 active+clean; 167 MiB data, 340 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 15 KiB/s wr, 81 op/s
Oct 10 10:20:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:20:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2156092452' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:20:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:20:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2156092452' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:20:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:26.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:26 compute-2 nova_compute[235775]: 2025-10-10 10:20:26.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:27.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:27 compute-2 ceph-mon[74913]: pgmap v1035: 353 pgs: 353 active+clean; 167 MiB data, 340 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 15 KiB/s wr, 81 op/s
Oct 10 10:20:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/2156092452' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:20:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/2156092452' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:20:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:28 compute-2 nova_compute[235775]: 2025-10-10 10:20:28.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:28.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:29.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:29 compute-2 ceph-mon[74913]: pgmap v1036: 353 pgs: 353 active+clean; 200 MiB data, 364 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 139 op/s
Oct 10 10:20:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:30.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:31.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:31 compute-2 ceph-mon[74913]: pgmap v1037: 353 pgs: 353 active+clean; 200 MiB data, 364 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:20:31 compute-2 nova_compute[235775]: 2025-10-10 10:20:31.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:20:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:20:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:32.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:33 compute-2 nova_compute[235775]: 2025-10-10 10:20:33.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:33.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:33 compute-2 ceph-mon[74913]: pgmap v1038: 353 pgs: 353 active+clean; 200 MiB data, 364 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:20:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:34.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:35.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:35 compute-2 ceph-mon[74913]: pgmap v1039: 353 pgs: 353 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Oct 10 10:20:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:36 compute-2 ceph-mon[74913]: pgmap v1040: 353 pgs: 353 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:20:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:20:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:36.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:20:36 compute-2 nova_compute[235775]: 2025-10-10 10:20:36.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:36 compute-2 podman[247660]: 2025-10-10 10:20:36.814147702 +0000 UTC m=+0.074334830 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 10:20:36 compute-2 podman[247661]: 2025-10-10 10:20:36.849930348 +0000 UTC m=+0.103348650 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 10 10:20:36 compute-2 podman[247662]: 2025-10-10 10:20:36.861195289 +0000 UTC m=+0.103094892 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 10:20:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:37.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:38 compute-2 nova_compute[235775]: 2025-10-10 10:20:38.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:38 compute-2 nova_compute[235775]: 2025-10-10 10:20:38.172 2 INFO nova.compute.manager [None req-6ab692d3-34c4-4952-aa17-bb34c3d87ab3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Get console output
Oct 10 10:20:38 compute-2 nova_compute[235775]: 2025-10-10 10:20:38.180 763 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 10 10:20:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:38.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:38 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:38.889 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:20:38 compute-2 nova_compute[235775]: 2025-10-10 10:20:38.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:38 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:38.890 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:20:39 compute-2 nova_compute[235775]: 2025-10-10 10:20:39.078 2 DEBUG nova.compute.manager [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-changed-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:20:39 compute-2 nova_compute[235775]: 2025-10-10 10:20:39.079 2 DEBUG nova.compute.manager [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing instance network info cache due to event network-changed-7369f952-1f44-445c-9449-347d6d476d79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 10 10:20:39 compute-2 nova_compute[235775]: 2025-10-10 10:20:39.079 2 DEBUG oslo_concurrency.lockutils [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:20:39 compute-2 nova_compute[235775]: 2025-10-10 10:20:39.080 2 DEBUG oslo_concurrency.lockutils [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:20:39 compute-2 nova_compute[235775]: 2025-10-10 10:20:39.080 2 DEBUG nova.network.neutron [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing network info cache for port 7369f952-1f44-445c-9449-347d6d476d79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 10 10:20:39 compute-2 ceph-mon[74913]: pgmap v1041: 353 pgs: 353 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Oct 10 10:20:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:39.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:40 compute-2 nova_compute[235775]: 2025-10-10 10:20:40.091 2 INFO nova.compute.manager [None req-f097cfbc-f867-41d2-9e1f-60cca500be17 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Get console output
Oct 10 10:20:40 compute-2 nova_compute[235775]: 2025-10-10 10:20:40.095 763 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 10 10:20:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:40 compute-2 sudo[247728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:20:40 compute-2 sudo[247728]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:20:40 compute-2 sudo[247728]: pam_unix(sudo:session): session closed for user root
Oct 10 10:20:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:40.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.183 2 DEBUG nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.183 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.184 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.185 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.185 2 DEBUG nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.186 2 WARNING nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state None.
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.186 2 DEBUG nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.187 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.188 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.189 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.189 2 DEBUG nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.190 2 WARNING nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state None.
Oct 10 10:20:41 compute-2 ceph-mon[74913]: pgmap v1042: 353 pgs: 353 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 15 KiB/s wr, 1 op/s
Oct 10 10:20:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:20:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:41.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:20:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:41.476 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:41.476 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:41.477 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:41 compute-2 nova_compute[235775]: 2025-10-10 10:20:41.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:42.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:43 compute-2 nova_compute[235775]: 2025-10-10 10:20:43.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:43 compute-2 ceph-mon[74913]: pgmap v1043: 353 pgs: 353 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 15 KiB/s wr, 1 op/s
Oct 10 10:20:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:43.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:43 compute-2 nova_compute[235775]: 2025-10-10 10:20:43.378 2 DEBUG nova.network.neutron [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated VIF entry in instance network info cache for port 7369f952-1f44-445c-9449-347d6d476d79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 10 10:20:43 compute-2 nova_compute[235775]: 2025-10-10 10:20:43.379 2 DEBUG nova.network.neutron [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:20:43 compute-2 nova_compute[235775]: 2025-10-10 10:20:43.404 2 DEBUG oslo_concurrency.lockutils [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:20:43 compute-2 nova_compute[235775]: 2025-10-10 10:20:43.653 2 DEBUG nova.compute.manager [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-changed-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:20:43 compute-2 nova_compute[235775]: 2025-10-10 10:20:43.654 2 DEBUG nova.compute.manager [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing instance network info cache due to event network-changed-7369f952-1f44-445c-9449-347d6d476d79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 10 10:20:43 compute-2 nova_compute[235775]: 2025-10-10 10:20:43.655 2 DEBUG oslo_concurrency.lockutils [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:20:43 compute-2 nova_compute[235775]: 2025-10-10 10:20:43.655 2 DEBUG oslo_concurrency.lockutils [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:20:43 compute-2 nova_compute[235775]: 2025-10-10 10:20:43.656 2 DEBUG nova.network.neutron [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing network info cache for port 7369f952-1f44-445c-9449-347d6d476d79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 10 10:20:43 compute-2 nova_compute[235775]: 2025-10-10 10:20:43.830 2 INFO nova.compute.manager [None req-27ba2d02-e255-4faf-afc6-87da70d38f8a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Get console output
Oct 10 10:20:43 compute-2 nova_compute[235775]: 2025-10-10 10:20:43.837 763 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct 10 10:20:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:44.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:44 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:44.892 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:20:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.140 2 DEBUG nova.network.neutron [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated VIF entry in instance network info cache for port 7369f952-1f44-445c-9449-347d6d476d79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.140 2 DEBUG nova.network.neutron [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.161 2 DEBUG oslo_concurrency.lockutils [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:20:45 compute-2 ceph-mon[74913]: pgmap v1044: 353 pgs: 353 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 7.5 KiB/s rd, 15 KiB/s wr, 2 op/s
Oct 10 10:20:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:45.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.773 2 DEBUG nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.774 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.774 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.775 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.775 2 DEBUG nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.775 2 WARNING nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state None.
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.775 2 DEBUG nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.776 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.776 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.776 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.777 2 DEBUG nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:20:45 compute-2 nova_compute[235775]: 2025-10-10 10:20:45.777 2 WARNING nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state None.
Oct 10 10:20:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:46.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:46 compute-2 nova_compute[235775]: 2025-10-10 10:20:46.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:47 compute-2 ceph-mon[74913]: pgmap v1045: 353 pgs: 353 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 6.7 KiB/s rd, 15 KiB/s wr, 2 op/s
Oct 10 10:20:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:20:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:47.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:48 compute-2 nova_compute[235775]: 2025-10-10 10:20:48.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2924327785' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:48.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:48 compute-2 podman[247762]: 2025-10-10 10:20:48.792622841 +0000 UTC m=+0.071257892 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct 10 10:20:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:49 compute-2 ceph-mon[74913]: pgmap v1046: 353 pgs: 353 active+clean; 121 MiB data, 324 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 20 KiB/s wr, 30 op/s
Oct 10 10:20:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:49.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.216 2 DEBUG nova.compute.manager [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-changed-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.216 2 DEBUG nova.compute.manager [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing instance network info cache due to event network-changed-7369f952-1f44-445c-9449-347d6d476d79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.217 2 DEBUG oslo_concurrency.lockutils [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.217 2 DEBUG oslo_concurrency.lockutils [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.217 2 DEBUG nova.network.neutron [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing network info cache for port 7369f952-1f44-445c-9449-347d6d476d79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.287 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.288 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.288 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.288 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.288 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.289 2 INFO nova.compute.manager [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Terminating instance
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.290 2 DEBUG nova.compute.manager [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 10 10:20:50 compute-2 kernel: tap7369f952-1f (unregistering): left promiscuous mode
Oct 10 10:20:50 compute-2 NetworkManager[44866]: <info>  [1760091650.3325] device (tap7369f952-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 10 10:20:50 compute-2 ovn_controller[132503]: 2025-10-10T10:20:50Z|00054|binding|INFO|Releasing lport 7369f952-1f44-445c-9449-347d6d476d79 from this chassis (sb_readonly=0)
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:50 compute-2 ovn_controller[132503]: 2025-10-10T10:20:50Z|00055|binding|INFO|Setting lport 7369f952-1f44-445c-9449-347d6d476d79 down in Southbound
Oct 10 10:20:50 compute-2 ovn_controller[132503]: 2025-10-10T10:20:50Z|00056|binding|INFO|Removing iface tap7369f952-1f ovn-installed in OVS
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.349 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:86:3b 10.100.0.5'], port_security=['fa:16:3e:54:86:3b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4fd38b02-f79c-4eb5-9939-6939dda28a15', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '8', 'neutron:security_group_ids': '26e88f36-7c05-4376-877b-78cbbe604817', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc36a9e4-a12c-4b9d-8968-49f72bde3476, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=7369f952-1f44-445c-9449-347d6d476d79) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.351 141795 INFO neutron.agent.ovn.metadata.agent [-] Port 7369f952-1f44-445c-9449-347d6d476d79 in datapath fb3e50c5-fe48-4113-87d7-4e11945ac752 unbound from our chassis
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.352 141795 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb3e50c5-fe48-4113-87d7-4e11945ac752, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.353 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[b8b1952c-1ffb-4e8c-8c7e-3c3bc90cbd0b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.354 141795 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752 namespace which is not needed anymore
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:50 compute-2 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct 10 10:20:50 compute-2 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Consumed 13.896s CPU time.
Oct 10 10:20:50 compute-2 systemd-machined[192768]: Machine qemu-3-instance-0000000b terminated.
Oct 10 10:20:50 compute-2 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [NOTICE]   (247322) : haproxy version is 2.8.14-c23fe91
Oct 10 10:20:50 compute-2 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [NOTICE]   (247322) : path to executable is /usr/sbin/haproxy
Oct 10 10:20:50 compute-2 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [WARNING]  (247322) : Exiting Master process...
Oct 10 10:20:50 compute-2 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [WARNING]  (247322) : Exiting Master process...
Oct 10 10:20:50 compute-2 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [ALERT]    (247322) : Current worker (247324) exited with code 143 (Terminated)
Oct 10 10:20:50 compute-2 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [WARNING]  (247322) : All workers exited. Exiting... (0)
Oct 10 10:20:50 compute-2 systemd[1]: libpod-c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3.scope: Deactivated successfully.
Oct 10 10:20:50 compute-2 podman[247806]: 2025-10-10 10:20:50.490225056 +0000 UTC m=+0.041378995 container died c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:20:50 compute-2 kernel: tap7369f952-1f: entered promiscuous mode
Oct 10 10:20:50 compute-2 kernel: tap7369f952-1f (unregistering): left promiscuous mode
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:50 compute-2 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3-userdata-shm.mount: Deactivated successfully.
Oct 10 10:20:50 compute-2 systemd[1]: var-lib-containers-storage-overlay-46f11d884e8a0583bed848a92a7b64922d9783a2c80338a45ab3d568340bb2fc-merged.mount: Deactivated successfully.
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.527 2 INFO nova.virt.libvirt.driver [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Instance destroyed successfully.
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.527 2 DEBUG nova.objects.instance [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'resources' on Instance uuid 4fd38b02-f79c-4eb5-9939-6939dda28a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 10:20:50 compute-2 podman[247806]: 2025-10-10 10:20:50.532042475 +0000 UTC m=+0.083196414 container cleanup c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:20:50 compute-2 systemd[1]: libpod-conmon-c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3.scope: Deactivated successfully.
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.542 2 DEBUG nova.virt.libvirt.vif [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-10T10:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-900143833',display_name='tempest-TestNetworkBasicOps-server-900143833',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-900143833',id=11,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJQ6vYp+U8d7Yiink0K/iQNUjrLla5VjGnuqrTVtw+u6eTZg4qjU5w1TFNoLgk+EE3EJPtqEojXIPj0UMRCIST/kkZjRsWCJV3t0ho4U419OoM2lVk7/JJmPOAXOx5ZoVg==',key_name='tempest-TestNetworkBasicOps-780402283',keypairs=<?>,launch_index=0,launched_at=2025-10-10T10:19:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-v13j2ta3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-10T10:19:53Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=4fd38b02-f79c-4eb5-9939-6939dda28a15,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.542 2 DEBUG nova.network.os_vif_util [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.543 2 DEBUG nova.network.os_vif_util [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.543 2 DEBUG os_vif [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.545 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7369f952-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.549 2 INFO os_vif [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f')
Oct 10 10:20:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:50.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:50 compute-2 podman[247843]: 2025-10-10 10:20:50.593008317 +0000 UTC m=+0.041489259 container remove c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.598 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[ab18cc26-656a-457d-b51c-5272a41541c6]: (4, ('Fri Oct 10 10:20:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752 (c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3)\nc17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3\nFri Oct 10 10:20:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752 (c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3)\nc17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.601 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[928587fb-369d-4c1a-9af4-798714ec91a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.602 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb3e50c5-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:20:50 compute-2 kernel: tapfb3e50c5-f0: left promiscuous mode
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.618 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[91e506e8-0471-4a19-a08b-095d4d50149e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.645 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0026a8-6b02-42cc-ac63-7d80000867f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.647 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[f84d46cb-d8c9-4aec-ad8c-dc91990b3e88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.667 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[5ac4c78d-679f-4fdf-bf58-53d7c80db79e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449623, 'reachable_time': 35322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247876, 'error': None, 'target': 'ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.669 141908 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Oct 10 10:20:50 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.669 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[4097473c-9a9b-43e3-ab3b-036e441fad94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 10:20:50 compute-2 systemd[1]: run-netns-ovnmeta\x2dfb3e50c5\x2dfe48\x2d4113\x2d87d7\x2d4e11945ac752.mount: Deactivated successfully.
Oct 10 10:20:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.927 2 INFO nova.virt.libvirt.driver [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Deleting instance files /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15_del
Oct 10 10:20:50 compute-2 nova_compute[235775]: 2025-10-10 10:20:50.929 2 INFO nova.virt.libvirt.driver [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Deletion of /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15_del complete
Oct 10 10:20:51 compute-2 nova_compute[235775]: 2025-10-10 10:20:51.025 2 INFO nova.compute.manager [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Took 0.73 seconds to destroy the instance on the hypervisor.
Oct 10 10:20:51 compute-2 nova_compute[235775]: 2025-10-10 10:20:51.026 2 DEBUG oslo.service.loopingcall [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 10 10:20:51 compute-2 nova_compute[235775]: 2025-10-10 10:20:51.027 2 DEBUG nova.compute.manager [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 10 10:20:51 compute-2 nova_compute[235775]: 2025-10-10 10:20:51.027 2 DEBUG nova.network.neutron [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 10 10:20:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:20:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:51.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:20:51 compute-2 ceph-mon[74913]: pgmap v1047: 353 pgs: 353 active+clean; 121 MiB data, 324 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 5.5 KiB/s wr, 29 op/s
Oct 10 10:20:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.313 2 DEBUG nova.network.neutron [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.317 2 DEBUG nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.317 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.317 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.318 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.318 2 DEBUG nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.318 2 DEBUG nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.318 2 DEBUG nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.318 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.319 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.319 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.319 2 DEBUG nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.319 2 WARNING nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state deleting.
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.333 2 INFO nova.compute.manager [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Took 1.31 seconds to deallocate network for instance.
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.385 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.385 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.402 2 DEBUG nova.compute.manager [req-bc8c7d51-389f-40f9-bc98-0c593c7bf1d7 req-0aef2350-6518-4435-b667-891e289f5bfa 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-deleted-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.447 2 DEBUG oslo_concurrency.processutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:20:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:52.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:52 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:20:52 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2442865633' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.916 2 DEBUG oslo_concurrency.processutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.923 2 DEBUG nova.compute.provider_tree [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.940 2 DEBUG nova.scheduler.client.report [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.971 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.996 2 DEBUG nova.network.neutron [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated VIF entry in instance network info cache for port 7369f952-1f44-445c-9449-347d6d476d79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 10 10:20:52 compute-2 nova_compute[235775]: 2025-10-10 10:20:52.996 2 DEBUG nova.network.neutron [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 10:20:53 compute-2 nova_compute[235775]: 2025-10-10 10:20:53.017 2 INFO nova.scheduler.client.report [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Deleted allocations for instance 4fd38b02-f79c-4eb5-9939-6939dda28a15
Oct 10 10:20:53 compute-2 nova_compute[235775]: 2025-10-10 10:20:53.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:53 compute-2 nova_compute[235775]: 2025-10-10 10:20:53.022 2 DEBUG oslo_concurrency.lockutils [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 10:20:53 compute-2 nova_compute[235775]: 2025-10-10 10:20:53.083 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:53.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:53 compute-2 ceph-mon[74913]: pgmap v1048: 353 pgs: 353 active+clean; 121 MiB data, 324 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 5.5 KiB/s wr, 29 op/s
Oct 10 10:20:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2442865633' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:54.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:54 compute-2 nova_compute[235775]: 2025-10-10 10:20:54.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:20:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:20:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:20:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:20:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:55.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:55 compute-2 ceph-mon[74913]: pgmap v1049: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 45 KiB/s rd, 7.7 KiB/s wr, 58 op/s
Oct 10 10:20:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:20:55 compute-2 nova_compute[235775]: 2025-10-10 10:20:55.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:55 compute-2 nova_compute[235775]: 2025-10-10 10:20:55.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:56.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:56 compute-2 nova_compute[235775]: 2025-10-10 10:20:56.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:56 compute-2 nova_compute[235775]: 2025-10-10 10:20:56.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:20:56 compute-2 nova_compute[235775]: 2025-10-10 10:20:56.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:20:56 compute-2 nova_compute[235775]: 2025-10-10 10:20:56.839 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:20:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:57.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:57 compute-2 ceph-mon[74913]: pgmap v1050: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 7.7 KiB/s wr, 57 op/s
Oct 10 10:20:57 compute-2 nova_compute[235775]: 2025-10-10 10:20:57.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:57 compute-2 nova_compute[235775]: 2025-10-10 10:20:57.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:57 compute-2 nova_compute[235775]: 2025-10-10 10:20:57.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:58 compute-2 nova_compute[235775]: 2025-10-10 10:20:58.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:58 compute-2 nova_compute[235775]: 2025-10-10 10:20:58.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:20:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:20:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:58.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:20:58 compute-2 nova_compute[235775]: 2025-10-10 10:20:58.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:58 compute-2 nova_compute[235775]: 2025-10-10 10:20:58.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:58 compute-2 nova_compute[235775]: 2025-10-10 10:20:58.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:20:58 compute-2 nova_compute[235775]: 2025-10-10 10:20:58.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:20:58 compute-2 nova_compute[235775]: 2025-10-10 10:20:58.842 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:58 compute-2 nova_compute[235775]: 2025-10-10 10:20:58.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:58 compute-2 nova_compute[235775]: 2025-10-10 10:20:58.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:20:58 compute-2 nova_compute[235775]: 2025-10-10 10:20:58.843 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:20:58 compute-2 nova_compute[235775]: 2025-10-10 10:20:58.843 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:20:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:20:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3082504773' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:20:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:20:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:59.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:20:59 compute-2 nova_compute[235775]: 2025-10-10 10:20:59.308 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:20:59 compute-2 ceph-mon[74913]: pgmap v1051: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 7.7 KiB/s wr, 57 op/s
Oct 10 10:20:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3082504773' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:20:59 compute-2 nova_compute[235775]: 2025-10-10 10:20:59.474 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:20:59 compute-2 nova_compute[235775]: 2025-10-10 10:20:59.475 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4897MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:20:59 compute-2 nova_compute[235775]: 2025-10-10 10:20:59.475 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:20:59 compute-2 nova_compute[235775]: 2025-10-10 10:20:59.476 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:20:59 compute-2 nova_compute[235775]: 2025-10-10 10:20:59.565 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:20:59 compute-2 nova_compute[235775]: 2025-10-10 10:20:59.565 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:20:59 compute-2 nova_compute[235775]: 2025-10-10 10:20:59.585 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:20:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:20:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:21:00 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4120952754' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:21:00 compute-2 nova_compute[235775]: 2025-10-10 10:21:00.044 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:21:00 compute-2 nova_compute[235775]: 2025-10-10 10:21:00.051 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:21:00 compute-2 nova_compute[235775]: 2025-10-10 10:21:00.067 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:21:00 compute-2 nova_compute[235775]: 2025-10-10 10:21:00.093 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:21:00 compute-2 nova_compute[235775]: 2025-10-10 10:21:00.093 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:21:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4267748783' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:21:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4120952754' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:21:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:00 compute-2 sudo[247955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:21:00 compute-2 nova_compute[235775]: 2025-10-10 10:21:00.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:00 compute-2 sudo[247955]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:21:00 compute-2 sudo[247955]: pam_unix(sudo:session): session closed for user root
Oct 10 10:21:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:00.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:01 compute-2 nova_compute[235775]: 2025-10-10 10:21:01.094 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:21:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:01.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:01 compute-2 ceph-mon[74913]: pgmap v1052: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.2 KiB/s wr, 29 op/s
Oct 10 10:21:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1358297504' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:21:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:21:01 compute-2 nova_compute[235775]: 2025-10-10 10:21:01.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:21:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:02.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:02 compute-2 nova_compute[235775]: 2025-10-10 10:21:02.838 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:21:02 compute-2 nova_compute[235775]: 2025-10-10 10:21:02.839 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 10 10:21:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:03 compute-2 nova_compute[235775]: 2025-10-10 10:21:03.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:03.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:03 compute-2 ceph-mon[74913]: pgmap v1053: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.2 KiB/s wr, 29 op/s
Oct 10 10:21:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:04.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:05.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:05 compute-2 ceph-mon[74913]: pgmap v1054: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.2 KiB/s wr, 29 op/s
Oct 10 10:21:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3705335326' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:21:05 compute-2 nova_compute[235775]: 2025-10-10 10:21:05.526 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1760091650.5248919, 4fd38b02-f79c-4eb5-9939-6939dda28a15 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 10:21:05 compute-2 nova_compute[235775]: 2025-10-10 10:21:05.526 2 INFO nova.compute.manager [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] VM Stopped (Lifecycle Event)
Oct 10 10:21:05 compute-2 nova_compute[235775]: 2025-10-10 10:21:05.547 2 DEBUG nova.compute.manager [None req-da73437c-51ac-4dca-9c3c-3173ab5bd15c - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 10:21:05 compute-2 nova_compute[235775]: 2025-10-10 10:21:05.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/127293249' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:21:06 compute-2 ceph-mon[74913]: pgmap v1055: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:21:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:06.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:06 compute-2 nova_compute[235775]: 2025-10-10 10:21:06.832 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:21:06 compute-2 nova_compute[235775]: 2025-10-10 10:21:06.833 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 10 10:21:06 compute-2 nova_compute[235775]: 2025-10-10 10:21:06.851 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 10 10:21:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:07.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:07 compute-2 podman[247988]: 2025-10-10 10:21:07.788366519 +0000 UTC m=+0.059820136 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible)
Oct 10 10:21:07 compute-2 podman[247990]: 2025-10-10 10:21:07.809383483 +0000 UTC m=+0.074854329 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 10:21:07 compute-2 podman[247989]: 2025-10-10 10:21:07.81055783 +0000 UTC m=+0.079482266 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:21:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:08 compute-2 nova_compute[235775]: 2025-10-10 10:21:08.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:08.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:09 compute-2 ceph-mon[74913]: pgmap v1056: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:21:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:21:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:09.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:21:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:10 compute-2 nova_compute[235775]: 2025-10-10 10:21:10.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:10.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:11 compute-2 ceph-mon[74913]: pgmap v1057: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:21:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:11.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:12.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:13 compute-2 nova_compute[235775]: 2025-10-10 10:21:13.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:13 compute-2 ceph-mon[74913]: pgmap v1058: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:21:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:21:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:13.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:21:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:14.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:15 compute-2 ceph-mon[74913]: pgmap v1059: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:21:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:15.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:15 compute-2 nova_compute[235775]: 2025-10-10 10:21:15.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:16.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:17 compute-2 ceph-mon[74913]: pgmap v1060: 353 pgs: 353 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:21:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:21:17 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/463243596' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:21:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:21:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:17.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:21:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:18 compute-2 nova_compute[235775]: 2025-10-10 10:21:18.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:18.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:19.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:19 compute-2 ceph-mon[74913]: pgmap v1061: 353 pgs: 353 active+clean; 88 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 1.8 MiB/s wr, 20 op/s
Oct 10 10:21:19 compute-2 podman[248063]: 2025-10-10 10:21:19.794186015 +0000 UTC m=+0.063664909 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:21:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:20 compute-2 nova_compute[235775]: 2025-10-10 10:21:20.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:20.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:20 compute-2 sudo[248083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:21:20 compute-2 sudo[248083]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:21:20 compute-2 sudo[248083]: pam_unix(sudo:session): session closed for user root
Oct 10 10:21:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:21 compute-2 sudo[248109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:21:21 compute-2 sudo[248109]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:21:21 compute-2 sudo[248109]: pam_unix(sudo:session): session closed for user root
Oct 10 10:21:21 compute-2 sudo[248134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:21:21 compute-2 sudo[248134]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:21:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:21.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:21 compute-2 ceph-mon[74913]: pgmap v1062: 353 pgs: 353 active+clean; 88 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 11 KiB/s rd, 1.8 MiB/s wr, 19 op/s
Oct 10 10:21:21 compute-2 sudo[248134]: pam_unix(sudo:session): session closed for user root
Oct 10 10:21:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:22 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:21:22 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:21:22 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:21:22 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:21:22 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:21:22 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:21:22 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:21:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:22.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:23 compute-2 nova_compute[235775]: 2025-10-10 10:21:23.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:23.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:23 compute-2 ceph-mon[74913]: pgmap v1063: 353 pgs: 353 active+clean; 88 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 1.8 MiB/s wr, 20 op/s
Oct 10 10:21:23 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3399503447' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:21:23 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1112813609' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 10:21:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:24.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:21:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:25.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:21:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:25 compute-2 ceph-mon[74913]: pgmap v1064: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Oct 10 10:21:25 compute-2 nova_compute[235775]: 2025-10-10 10:21:25.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:26.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:26 compute-2 sudo[248197]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:21:26 compute-2 sudo[248197]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:21:26 compute-2 sudo[248197]: pam_unix(sudo:session): session closed for user root
Oct 10 10:21:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:27.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:27 compute-2 ceph-mon[74913]: pgmap v1065: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Oct 10 10:21:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/213170853' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:21:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/213170853' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:21:27 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:21:27 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:21:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:28 compute-2 nova_compute[235775]: 2025-10-10 10:21:28.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:28 compute-2 ceph-mon[74913]: pgmap v1066: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 104 op/s
Oct 10 10:21:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:28.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:29.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:30 compute-2 nova_compute[235775]: 2025-10-10 10:21:30.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:21:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:30.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:21:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:30 compute-2 ceph-mon[74913]: pgmap v1067: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 13 KiB/s wr, 84 op/s
Oct 10 10:21:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:31.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:31 compute-2 sshd-session[248195]: Connection reset by 198.235.24.49 port 59788 [preauth]
Oct 10 10:21:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:21:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:21:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:32.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:21:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:33 compute-2 ceph-mon[74913]: pgmap v1068: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 13 KiB/s wr, 84 op/s
Oct 10 10:21:33 compute-2 nova_compute[235775]: 2025-10-10 10:21:33.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:21:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:33.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:21:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.034772) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694034910, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2373, "num_deletes": 251, "total_data_size": 6361445, "memory_usage": 6441408, "flush_reason": "Manual Compaction"}
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694058893, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4092731, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31338, "largest_seqno": 33706, "table_properties": {"data_size": 4083132, "index_size": 6029, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20048, "raw_average_key_size": 20, "raw_value_size": 4063987, "raw_average_value_size": 4155, "num_data_blocks": 259, "num_entries": 978, "num_filter_entries": 978, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091492, "oldest_key_time": 1760091492, "file_creation_time": 1760091694, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 24157 microseconds, and 15143 cpu microseconds.
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.058948) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4092731 bytes OK
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.058971) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.061897) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.061918) EVENT_LOG_v1 {"time_micros": 1760091694061912, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.061935) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6350967, prev total WAL file size 6350967, number of live WAL files 2.
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.063215) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3996KB)], [60(11MB)]
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694063238, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16141387, "oldest_snapshot_seqno": -1}
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6210 keys, 14032947 bytes, temperature: kUnknown
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694122602, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14032947, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13992364, "index_size": 23961, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15557, "raw_key_size": 159050, "raw_average_key_size": 25, "raw_value_size": 13881461, "raw_average_value_size": 2235, "num_data_blocks": 964, "num_entries": 6210, "num_filter_entries": 6210, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091694, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.122802) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14032947 bytes
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.124699) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 271.6 rd, 236.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 11.5 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(7.4) write-amplify(3.4) OK, records in: 6731, records dropped: 521 output_compression: NoCompression
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.124735) EVENT_LOG_v1 {"time_micros": 1760091694124721, "job": 36, "event": "compaction_finished", "compaction_time_micros": 59427, "compaction_time_cpu_micros": 26458, "output_level": 6, "num_output_files": 1, "total_output_size": 14032947, "num_input_records": 6731, "num_output_records": 6210, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694125706, "job": 36, "event": "table_file_deletion", "file_number": 62}
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694127720, "job": 36, "event": "table_file_deletion", "file_number": 60}
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.063117) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.127782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.127789) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.127791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.127793) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:21:34 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.127795) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:21:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 10:21:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:34.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 10:21:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:35 compute-2 ceph-mon[74913]: pgmap v1069: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 83 op/s
Oct 10 10:21:35 compute-2 ovn_controller[132503]: 2025-10-10T10:21:35Z|00057|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Oct 10 10:21:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:21:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:35.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:21:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:35 compute-2 nova_compute[235775]: 2025-10-10 10:21:35.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:36.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:37 compute-2 ceph-mon[74913]: pgmap v1070: 353 pgs: 353 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Oct 10 10:21:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:37.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:38 compute-2 nova_compute[235775]: 2025-10-10 10:21:38.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:38.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:38 compute-2 podman[248237]: 2025-10-10 10:21:38.818621384 +0000 UTC m=+0.072067458 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 10:21:38 compute-2 podman[248235]: 2025-10-10 10:21:38.829455151 +0000 UTC m=+0.090359313 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 10 10:21:38 compute-2 podman[248236]: 2025-10-10 10:21:38.8534548 +0000 UTC m=+0.110294073 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 10:21:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:39 compute-2 ceph-mon[74913]: pgmap v1071: 353 pgs: 353 active+clean; 113 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 127 op/s
Oct 10 10:21:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:39.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:40.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:40 compute-2 nova_compute[235775]: 2025-10-10 10:21:40.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:40 compute-2 sudo[248298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:21:40 compute-2 sudo[248298]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:21:40 compute-2 sudo[248298]: pam_unix(sudo:session): session closed for user root
Oct 10 10:21:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:41 compute-2 ceph-mon[74913]: pgmap v1072: 353 pgs: 353 active+clean; 113 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 292 KiB/s rd, 2.1 MiB/s wr, 52 op/s
Oct 10 10:21:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:41.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:21:41.477 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:21:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:21:41.478 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:21:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:21:41.478 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:21:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:42.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:43 compute-2 nova_compute[235775]: 2025-10-10 10:21:43.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:43 compute-2 ceph-mon[74913]: pgmap v1073: 353 pgs: 353 active+clean; 113 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 292 KiB/s rd, 2.1 MiB/s wr, 52 op/s
Oct 10 10:21:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 10:21:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:43.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 10:21:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:44.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:45 compute-2 ceph-mon[74913]: pgmap v1074: 353 pgs: 353 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Oct 10 10:21:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:45.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:45 compute-2 nova_compute[235775]: 2025-10-10 10:21:45.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:21:45.835 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:21:45 compute-2 nova_compute[235775]: 2025-10-10 10:21:45.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:45 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:21:45.836 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:21:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:21:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:46.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:21:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:47 compute-2 ceph-mon[74913]: pgmap v1075: 353 pgs: 353 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Oct 10 10:21:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:21:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:47.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:48 compute-2 nova_compute[235775]: 2025-10-10 10:21:48.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:48.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:49 compute-2 ceph-mon[74913]: pgmap v1076: 353 pgs: 353 active+clean; 41 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 339 KiB/s rd, 2.1 MiB/s wr, 83 op/s
Oct 10 10:21:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/991923566' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:21:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:21:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:49.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:21:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:50.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:50 compute-2 nova_compute[235775]: 2025-10-10 10:21:50.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:50 compute-2 podman[248333]: 2025-10-10 10:21:50.775251233 +0000 UTC m=+0.053454962 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent)
Oct 10 10:21:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:51 compute-2 ceph-mon[74913]: pgmap v1077: 353 pgs: 353 active+clean; 41 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 47 KiB/s rd, 51 KiB/s wr, 31 op/s
Oct 10 10:21:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:21:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:51.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:21:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:52 compute-2 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 10 10:21:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:21:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:52.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:21:52 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:21:52.837 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:21:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:53 compute-2 nova_compute[235775]: 2025-10-10 10:21:53.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:53 compute-2 ceph-mon[74913]: pgmap v1078: 353 pgs: 353 active+clean; 41 MiB data, 323 MiB used, 60 GiB / 60 GiB avail; 47 KiB/s rd, 51 KiB/s wr, 31 op/s
Oct 10 10:21:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:21:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:53.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:21:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:54.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:21:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:21:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:21:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:21:55 compute-2 ceph-mon[74913]: pgmap v1079: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 52 KiB/s wr, 41 op/s
Oct 10 10:21:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:21:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:55.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:21:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:21:55 compute-2 nova_compute[235775]: 2025-10-10 10:21:55.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:55 compute-2 nova_compute[235775]: 2025-10-10 10:21:55.832 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:21:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:56.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:57 compute-2 ceph-mon[74913]: pgmap v1080: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 13 KiB/s wr, 29 op/s
Oct 10 10:21:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:21:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:57.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:21:57 compute-2 nova_compute[235775]: 2025-10-10 10:21:57.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:21:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:21:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:21:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:58.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.847 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.847 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.847 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.875 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.876 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.876 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.876 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:21:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:58 compute-2 nova_compute[235775]: 2025-10-10 10:21:58.877 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:21:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:59 compute-2 ceph-mon[74913]: pgmap v1081: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 13 KiB/s wr, 29 op/s
Oct 10 10:21:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:21:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4151122925' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:21:59 compute-2 nova_compute[235775]: 2025-10-10 10:21:59.329 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:21:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:21:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:21:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:59.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:21:59 compute-2 nova_compute[235775]: 2025-10-10 10:21:59.525 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:21:59 compute-2 nova_compute[235775]: 2025-10-10 10:21:59.526 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4903MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:21:59 compute-2 nova_compute[235775]: 2025-10-10 10:21:59.527 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:21:59 compute-2 nova_compute[235775]: 2025-10-10 10:21:59.527 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:21:59 compute-2 nova_compute[235775]: 2025-10-10 10:21:59.599 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:21:59 compute-2 nova_compute[235775]: 2025-10-10 10:21:59.600 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:21:59 compute-2 nova_compute[235775]: 2025-10-10 10:21:59.863 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:21:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:21:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4151122925' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:22:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:22:00 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2003623' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:22:00 compute-2 nova_compute[235775]: 2025-10-10 10:22:00.364 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:22:00 compute-2 nova_compute[235775]: 2025-10-10 10:22:00.373 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:22:00 compute-2 nova_compute[235775]: 2025-10-10 10:22:00.395 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:22:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:00 compute-2 nova_compute[235775]: 2025-10-10 10:22:00.400 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:22:00 compute-2 nova_compute[235775]: 2025-10-10 10:22:00.400 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.873s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:22:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:00.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:00 compute-2 nova_compute[235775]: 2025-10-10 10:22:00.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:00 compute-2 sudo[248407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:22:00 compute-2 sudo[248407]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:22:00 compute-2 sudo[248407]: pam_unix(sudo:session): session closed for user root
Oct 10 10:22:01 compute-2 ceph-mon[74913]: pgmap v1082: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 8.0 KiB/s rd, 596 B/s wr, 11 op/s
Oct 10 10:22:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2003623' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:22:01 compute-2 nova_compute[235775]: 2025-10-10 10:22:01.369 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:22:01 compute-2 nova_compute[235775]: 2025-10-10 10:22:01.370 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:22:01 compute-2 nova_compute[235775]: 2025-10-10 10:22:01.370 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:22:01 compute-2 nova_compute[235775]: 2025-10-10 10:22:01.370 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:22:01 compute-2 nova_compute[235775]: 2025-10-10 10:22:01.370 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:22:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:01.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:22:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2212575521' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:22:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:02.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:03 compute-2 nova_compute[235775]: 2025-10-10 10:22:03.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:03 compute-2 ceph-mon[74913]: pgmap v1083: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 8.0 KiB/s rd, 597 B/s wr, 11 op/s
Oct 10 10:22:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2094992256' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:22:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:03.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:03 compute-2 nova_compute[235775]: 2025-10-10 10:22:03.811 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:22:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:04.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:05 compute-2 ceph-mon[74913]: pgmap v1084: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 8.5 KiB/s rd, 597 B/s wr, 11 op/s
Oct 10 10:22:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:05.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:05 compute-2 nova_compute[235775]: 2025-10-10 10:22:05.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:22:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:06.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:22:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:07 compute-2 ceph-mon[74913]: pgmap v1085: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1956938146' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:22:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:07.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:08 compute-2 nova_compute[235775]: 2025-10-10 10:22:08.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:08 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3246728595' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:22:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:08.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:09 compute-2 ceph-mon[74913]: pgmap v1086: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:22:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:09.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:09 compute-2 podman[248443]: 2025-10-10 10:22:09.812499453 +0000 UTC m=+0.069370662 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 10 10:22:09 compute-2 podman[248441]: 2025-10-10 10:22:09.816568013 +0000 UTC m=+0.080976573 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Oct 10 10:22:09 compute-2 podman[248442]: 2025-10-10 10:22:09.836737439 +0000 UTC m=+0.106345586 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 10:22:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:10.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:10 compute-2 nova_compute[235775]: 2025-10-10 10:22:10.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:11 compute-2 ceph-mon[74913]: pgmap v1087: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:11.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:12.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:13 compute-2 nova_compute[235775]: 2025-10-10 10:22:13.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:13 compute-2 ceph-mon[74913]: pgmap v1088: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:13.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:14.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:15.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:15 compute-2 ceph-mon[74913]: pgmap v1089: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:22:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:15 compute-2 nova_compute[235775]: 2025-10-10 10:22:15.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:22:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:16.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:17 compute-2 unix_chkpwd[248516]: password check failed for user (root)
Oct 10 10:22:17 compute-2 sshd-session[248512]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 10 10:22:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:22:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:17.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:22:17 compute-2 ceph-mon[74913]: pgmap v1090: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:18 compute-2 nova_compute[235775]: 2025-10-10 10:22:18.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:18.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:18 compute-2 sshd-session[248512]: Failed password for root from 193.46.255.33 port 60762 ssh2
Oct 10 10:22:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:22:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:19.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:22:19 compute-2 unix_chkpwd[248519]: password check failed for user (root)
Oct 10 10:22:19 compute-2 ceph-mon[74913]: pgmap v1091: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:22:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:20 compute-2 ceph-mon[74913]: pgmap v1092: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:22:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:20.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:22:20 compute-2 nova_compute[235775]: 2025-10-10 10:22:20.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:20 compute-2 sudo[248522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:22:20 compute-2 sudo[248522]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:22:20 compute-2 sudo[248522]: pam_unix(sudo:session): session closed for user root
Oct 10 10:22:21 compute-2 podman[248546]: 2025-10-10 10:22:21.044134308 +0000 UTC m=+0.049105383 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 10:22:21 compute-2 sshd-session[248512]: Failed password for root from 193.46.255.33 port 60762 ssh2
Oct 10 10:22:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:21.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:21 compute-2 unix_chkpwd[248566]: password check failed for user (root)
Oct 10 10:22:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:22.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:23 compute-2 ceph-mon[74913]: pgmap v1093: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:23 compute-2 nova_compute[235775]: 2025-10-10 10:22:23.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:23.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:23 compute-2 sshd-session[248512]: Failed password for root from 193.46.255.33 port 60762 ssh2
Oct 10 10:22:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:22:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:24.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:22:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:25 compute-2 ceph-mon[74913]: pgmap v1094: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:22:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:25.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:25 compute-2 sshd-session[248512]: Received disconnect from 193.46.255.33 port 60762:11:  [preauth]
Oct 10 10:22:25 compute-2 sshd-session[248512]: Disconnected from authenticating user root 193.46.255.33 port 60762 [preauth]
Oct 10 10:22:25 compute-2 sshd-session[248512]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 10 10:22:25 compute-2 nova_compute[235775]: 2025-10-10 10:22:25.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:26 compute-2 unix_chkpwd[248573]: password check failed for user (root)
Oct 10 10:22:26 compute-2 sshd-session[248571]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 10 10:22:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:22:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4103532067' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:22:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:22:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4103532067' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:22:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:26.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:26 compute-2 sudo[248575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:22:26 compute-2 sudo[248575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:22:26 compute-2 sudo[248575]: pam_unix(sudo:session): session closed for user root
Oct 10 10:22:26 compute-2 sudo[248600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:22:26 compute-2 sudo[248600]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:22:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:27 compute-2 ceph-mon[74913]: pgmap v1095: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/4103532067' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:22:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/4103532067' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:22:27 compute-2 sudo[248600]: pam_unix(sudo:session): session closed for user root
Oct 10 10:22:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:22:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:27.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:22:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:28 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:22:28 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:22:28 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:22:28 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:22:28 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:22:28 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:22:28 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:22:28 compute-2 nova_compute[235775]: 2025-10-10 10:22:28.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:28 compute-2 sshd-session[248571]: Failed password for root from 193.46.255.33 port 48730 ssh2
Oct 10 10:22:28 compute-2 unix_chkpwd[248657]: password check failed for user (root)
Oct 10 10:22:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:28.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:29 compute-2 ceph-mon[74913]: pgmap v1096: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:22:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:29.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.460335) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749460427, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 777, "num_deletes": 250, "total_data_size": 1453649, "memory_usage": 1476816, "flush_reason": "Manual Compaction"}
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749468261, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 956064, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33712, "largest_seqno": 34483, "table_properties": {"data_size": 952472, "index_size": 1436, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7475, "raw_average_key_size": 17, "raw_value_size": 945175, "raw_average_value_size": 2172, "num_data_blocks": 64, "num_entries": 435, "num_filter_entries": 435, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091694, "oldest_key_time": 1760091694, "file_creation_time": 1760091749, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 7965 microseconds, and 4096 cpu microseconds.
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.468313) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 956064 bytes OK
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.468333) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.470003) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.470024) EVENT_LOG_v1 {"time_micros": 1760091749470017, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.470045) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 1449580, prev total WAL file size 1449580, number of live WAL files 2.
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.470805) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323531' seq:72057594037927935, type:22 .. '6B7600353032' seq:0, type:0; will stop at (end)
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(933KB)], [63(13MB)]
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749470886, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 14989011, "oldest_snapshot_seqno": -1}
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6133 keys, 13814758 bytes, temperature: kUnknown
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749560557, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 13814758, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13774532, "index_size": 23796, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15365, "raw_key_size": 159160, "raw_average_key_size": 25, "raw_value_size": 13664685, "raw_average_value_size": 2228, "num_data_blocks": 942, "num_entries": 6133, "num_filter_entries": 6133, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091749, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.560803) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 13814758 bytes
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.562372) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 167.1 rd, 154.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 13.4 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(30.1) write-amplify(14.4) OK, records in: 6645, records dropped: 512 output_compression: NoCompression
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.562404) EVENT_LOG_v1 {"time_micros": 1760091749562390, "job": 38, "event": "compaction_finished", "compaction_time_micros": 89725, "compaction_time_cpu_micros": 37217, "output_level": 6, "num_output_files": 1, "total_output_size": 13814758, "num_input_records": 6645, "num_output_records": 6133, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749562898, "job": 38, "event": "table_file_deletion", "file_number": 65}
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749567720, "job": 38, "event": "table_file_deletion", "file_number": 63}
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.470728) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.567789) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.567794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.567796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.567800) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:22:29 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.567802) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:22:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:30 compute-2 sshd-session[248571]: Failed password for root from 193.46.255.33 port 48730 ssh2
Oct 10 10:22:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:30 compute-2 ceph-mon[74913]: pgmap v1097: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:22:30 compute-2 unix_chkpwd[248660]: password check failed for user (root)
Oct 10 10:22:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:30.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:30 compute-2 nova_compute[235775]: 2025-10-10 10:22:30.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:31.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:22:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:32 compute-2 sshd-session[248571]: Failed password for root from 193.46.255.33 port 48730 ssh2
Oct 10 10:22:32 compute-2 sudo[248663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:22:32 compute-2 sudo[248663]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:22:32 compute-2 sudo[248663]: pam_unix(sudo:session): session closed for user root
Oct 10 10:22:32 compute-2 sshd-session[248571]: Received disconnect from 193.46.255.33 port 48730:11:  [preauth]
Oct 10 10:22:32 compute-2 sshd-session[248571]: Disconnected from authenticating user root 193.46.255.33 port 48730 [preauth]
Oct 10 10:22:32 compute-2 sshd-session[248571]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 10 10:22:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:32.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:33 compute-2 nova_compute[235775]: 2025-10-10 10:22:33.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:33 compute-2 ceph-mon[74913]: pgmap v1098: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:22:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:22:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:22:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:33.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:33 compute-2 unix_chkpwd[248692]: password check failed for user (root)
Oct 10 10:22:33 compute-2 sshd-session[248689]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 10 10:22:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:34.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:35 compute-2 ceph-mon[74913]: pgmap v1099: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:22:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:35.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:35 compute-2 sshd-session[248689]: Failed password for root from 193.46.255.33 port 28508 ssh2
Oct 10 10:22:35 compute-2 nova_compute[235775]: 2025-10-10 10:22:35.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:36.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:37 compute-2 ceph-mon[74913]: pgmap v1100: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:22:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:37.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:37 compute-2 unix_chkpwd[248697]: password check failed for user (root)
Oct 10 10:22:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:38 compute-2 nova_compute[235775]: 2025-10-10 10:22:38.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:38.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:39 compute-2 ceph-mon[74913]: pgmap v1101: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:22:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:39.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:39 compute-2 sshd-session[248689]: Failed password for root from 193.46.255.33 port 28508 ssh2
Oct 10 10:22:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:22:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:40.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:22:40 compute-2 nova_compute[235775]: 2025-10-10 10:22:40.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:40 compute-2 podman[248701]: 2025-10-10 10:22:40.808069406 +0000 UTC m=+0.070320798 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Oct 10 10:22:40 compute-2 podman[248703]: 2025-10-10 10:22:40.82851417 +0000 UTC m=+0.082618141 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:22:40 compute-2 podman[248702]: 2025-10-10 10:22:40.841773733 +0000 UTC m=+0.101797934 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 10:22:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:41 compute-2 sudo[248765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:22:41 compute-2 sudo[248765]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:22:41 compute-2 sudo[248765]: pam_unix(sudo:session): session closed for user root
Oct 10 10:22:41 compute-2 ceph-mon[74913]: pgmap v1102: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:41.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:22:41.479 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:22:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:22:41.479 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:22:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:22:41.479 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:22:41 compute-2 unix_chkpwd[248790]: password check failed for user (root)
Oct 10 10:22:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:22:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:42.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:22:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:43 compute-2 nova_compute[235775]: 2025-10-10 10:22:43.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:43 compute-2 ceph-mon[74913]: pgmap v1103: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:43.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:43 compute-2 sshd-session[248689]: Failed password for root from 193.46.255.33 port 28508 ssh2
Oct 10 10:22:43 compute-2 sshd-session[248689]: Received disconnect from 193.46.255.33 port 28508:11:  [preauth]
Oct 10 10:22:43 compute-2 sshd-session[248793]: Accepted publickey for zuul from 192.168.122.10 port 58248 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 10:22:43 compute-2 sshd-session[248689]: Disconnected from authenticating user root 193.46.255.33 port 28508 [preauth]
Oct 10 10:22:43 compute-2 sshd-session[248689]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Oct 10 10:22:43 compute-2 systemd-logind[796]: New session 57 of user zuul.
Oct 10 10:22:43 compute-2 systemd[1]: Started Session 57 of User zuul.
Oct 10 10:22:43 compute-2 sshd-session[248793]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 10:22:43 compute-2 sudo[248797]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 10 10:22:43 compute-2 sudo[248797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:22:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:44.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:45 compute-2 ceph-mon[74913]: pgmap v1104: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:22:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:45.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:45 compute-2 nova_compute[235775]: 2025-10-10 10:22:45.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:22:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:46.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:47 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Oct 10 10:22:47 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3034678619' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:22:47 compute-2 ceph-mon[74913]: pgmap v1105: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:47 compute-2 ceph-mon[74913]: from='client.26149 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3034678619' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:22:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:47.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:48 compute-2 nova_compute[235775]: 2025-10-10 10:22:48.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:48 compute-2 ceph-mon[74913]: from='client.25805 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:48 compute-2 ceph-mon[74913]: from='client.16650 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:48 compute-2 ceph-mon[74913]: from='client.26158 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:48 compute-2 ceph-mon[74913]: from='client.25814 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:48 compute-2 ceph-mon[74913]: from='client.16662 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1845402693' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:22:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1224328138' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:22:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:48.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:49 compute-2 ceph-mon[74913]: pgmap v1106: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:22:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:49.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:50 compute-2 ceph-mon[74913]: pgmap v1107: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:50.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:50 compute-2 nova_compute[235775]: 2025-10-10 10:22:50.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:51.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:51 compute-2 podman[249136]: 2025-10-10 10:22:51.789557233 +0000 UTC m=+0.064001696 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:22:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:52 compute-2 ceph-mon[74913]: pgmap v1108: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:52.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:52 compute-2 ovs-vsctl[249187]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 10 10:22:53 compute-2 nova_compute[235775]: 2025-10-10 10:22:53.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:53.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:53 compute-2 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 10 10:22:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:53 compute-2 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 10 10:22:53 compute-2 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 10 10:22:54 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: cache status {prefix=cache status} (starting...)
Oct 10 10:22:54 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: client ls {prefix=client ls} (starting...)
Oct 10 10:22:54 compute-2 lvm[249544]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 10:22:54 compute-2 lvm[249544]: VG ceph_vg0 finished
Oct 10 10:22:54 compute-2 ceph-mon[74913]: pgmap v1109: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:22:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:54.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:54 compute-2 kernel: block dm-0: the capability attribute has been deprecated.
Oct 10 10:22:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:22:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:22:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:22:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:22:55 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: damage ls {prefix=damage ls} (starting...)
Oct 10 10:22:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Oct 10 10:22:55 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1672425821' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:22:55 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump loads {prefix=dump loads} (starting...)
Oct 10 10:22:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:22:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:22:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:55.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:22:55 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Oct 10 10:22:55 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Oct 10 10:22:55 compute-2 ceph-mon[74913]: from='client.26173 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1672425821' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:22:55 compute-2 ceph-mon[74913]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:22:55 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Oct 10 10:22:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Oct 10 10:22:55 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1860179327' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:22:55 compute-2 nova_compute[235775]: 2025-10-10 10:22:55.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:55 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Oct 10 10:22:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Oct 10 10:22:56 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4222220785' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Oct 10 10:22:56 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: get subtrees {prefix=get subtrees} (starting...)
Oct 10 10:22:56 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: ops {prefix=ops} (starting...)
Oct 10 10:22:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Oct 10 10:22:56 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3187780686' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.25829 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.26185 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: pgmap v1110: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.16677 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1860179327' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1736330349' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/588890842' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.25847 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.26209 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.16692 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4222220785' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/661691761' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2495586288' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.25862 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.26227 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3187780686' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2037339908' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1469971866' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct 10 10:22:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:56.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Oct 10 10:22:56 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2388601897' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct 10 10:22:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:57 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: session ls {prefix=session ls} (starting...)
Oct 10 10:22:57 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Oct 10 10:22:57 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1390305313' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: status {prefix=status} (starting...)
Oct 10 10:22:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:57.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:57 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Oct 10 10:22:57 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/778203399' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.16713 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.25874 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2388601897' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.16731 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.26257 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2477553374' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/610000102' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2039649700' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1390305313' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.26281 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4242818798' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2988512648' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/778203399' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:22:57 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Oct 10 10:22:57 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2332321882' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:22:57 compute-2 nova_compute[235775]: 2025-10-10 10:22:57.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:22:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:58 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Oct 10 10:22:58 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3063974506' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:22:58 compute-2 nova_compute[235775]: 2025-10-10 10:22:58.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:22:58 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Oct 10 10:22:58 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1401553685' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Oct 10 10:22:58 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/512450568' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Oct 10 10:22:58 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/832198139' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.16764 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.25895 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: pgmap v1111: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2332321882' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2036159140' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.26314 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.25916 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3421568049' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3063974506' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1401553685' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/512450568' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1035999956' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1328721408' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1488282981' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/832198139' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3760174204' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:22:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:58.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:58 compute-2 nova_compute[235775]: 2025-10-10 10:22:58.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:22:58 compute-2 nova_compute[235775]: 2025-10-10 10:22:58.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:22:58 compute-2 nova_compute[235775]: 2025-10-10 10:22:58.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:22:58 compute-2 nova_compute[235775]: 2025-10-10 10:22:58.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:22:58 compute-2 nova_compute[235775]: 2025-10-10 10:22:58.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:22:58 compute-2 nova_compute[235775]: 2025-10-10 10:22:58.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:22:58 compute-2 nova_compute[235775]: 2025-10-10 10:22:58.842 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:22:58 compute-2 nova_compute[235775]: 2025-10-10 10:22:58.842 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:22:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:58 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Oct 10 10:22:58 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1758553721' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 10:22:58 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Oct 10 10:22:58 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4260196095' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Oct 10 10:22:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/699035021' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:22:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2142363579' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:22:59 compute-2 nova_compute[235775]: 2025-10-10 10:22:59.293 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:22:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Oct 10 10:22:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4087547626' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:22:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:22:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:22:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:59.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:22:59 compute-2 nova_compute[235775]: 2025-10-10 10:22:59.457 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:22:59 compute-2 nova_compute[235775]: 2025-10-10 10:22:59.459 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4776MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:22:59 compute-2 nova_compute[235775]: 2025-10-10 10:22:59.460 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:22:59 compute-2 nova_compute[235775]: 2025-10-10 10:22:59.460 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:22:59 compute-2 nova_compute[235775]: 2025-10-10 10:22:59.533 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:22:59 compute-2 nova_compute[235775]: 2025-10-10 10:22:59.534 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:22:59 compute-2 nova_compute[235775]: 2025-10-10 10:22:59.557 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:22:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Oct 10 10:22:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1467050544' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Oct 10 10:22:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2700707402' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.26362 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3421045328' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1758553721' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3233252603' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4260196095' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/699035021' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1647674057' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.25967 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2142363579' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.16857 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3055579754' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4087547626' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1467050544' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2700707402' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3635718541' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct 10 10:22:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:22:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:23:00 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1751714302' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Oct 10 10:23:00 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2444127767' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:23:00 compute-2 nova_compute[235775]: 2025-10-10 10:23:00.043 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:23:00 compute-2 nova_compute[235775]: 2025-10-10 10:23:00.048 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:23:00 compute-2 nova_compute[235775]: 2025-10-10 10:23:00.083 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:23:00 compute-2 nova_compute[235775]: 2025-10-10 10:23:00.085 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:23:00 compute-2 nova_compute[235775]: 2025-10-10 10:23:00.085 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:23:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Oct 10 10:23:00 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/670445082' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:10.851897+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876129 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68894720 unmapped: 1318912 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:11.852074+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68894720 unmapped: 1318912 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:12.852255+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 1310720 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:13.852475+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 1310720 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:14.852665+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 1310720 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:15.852881+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876129 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 1302528 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:16.853121+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 1302528 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:17.853314+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 1294336 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:18.853521+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 1294336 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:19.853771+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 1294336 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:20.853919+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876129 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68927488 unmapped: 1286144 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 70.500732422s of 70.510574341s, submitted: 3
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:21.854181+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68935680 unmapped: 1277952 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:22.854375+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68952064 unmapped: 1261568 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:23.854584+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68960256 unmapped: 1253376 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:24.854803+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68960256 unmapped: 1253376 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:25.855001+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879153 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68968448 unmapped: 1245184 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:26.855172+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68976640 unmapped: 1236992 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:27.855340+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68976640 unmapped: 1236992 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:28.855625+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68984832 unmapped: 1228800 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:29.855943+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68984832 unmapped: 1228800 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:30.856127+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878562 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68993024 unmapped: 1220608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:31.856358+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68993024 unmapped: 1220608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:32.856516+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68993024 unmapped: 1220608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:33.856714+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 1212416 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:34.856937+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 1212416 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:35.857146+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878562 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 1212416 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb2515e000 session 0x55cb222bcf00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:36.857286+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69009408 unmapped: 1204224 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:37.857439+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69009408 unmapped: 1204224 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:38.857590+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69025792 unmapped: 1187840 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:39.858260+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69025792 unmapped: 1187840 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:40.858397+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878562 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69025792 unmapped: 1187840 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:41.858553+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69042176 unmapped: 1171456 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:42.858681+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69042176 unmapped: 1171456 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:43.858842+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69050368 unmapped: 1163264 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:44.858961+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69050368 unmapped: 1163264 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:45.859114+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878562 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69050368 unmapped: 1163264 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:46.859236+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69058560 unmapped: 1155072 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:47.859370+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69058560 unmapped: 1155072 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:48.859504+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69058560 unmapped: 1155072 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:49.860409+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.021001816s of 29.033502579s, submitted: 3
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 1138688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:50.886290+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881586 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 1138688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:51.886564+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69066752 unmapped: 1146880 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:52.886907+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 1138688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:53.887119+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 1138688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:54.887333+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69083136 unmapped: 1130496 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:55.887472+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881586 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69083136 unmapped: 1130496 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:56.887621+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69107712 unmapped: 1105920 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:57.887760+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69107712 unmapped: 1105920 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:58.887902+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69107712 unmapped: 1105920 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:50:59.888073+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69115904 unmapped: 1097728 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:00.888229+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880404 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69115904 unmapped: 1097728 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:01.888368+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69132288 unmapped: 1081344 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:02.888512+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69132288 unmapped: 1081344 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:03.888685+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69140480 unmapped: 1073152 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:04.888900+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69148672 unmapped: 1064960 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:05.889236+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880404 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb2515f000 session 0x55cb22a503c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69148672 unmapped: 1064960 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:06.889389+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69156864 unmapped: 1056768 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:07.889550+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69165056 unmapped: 1048576 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:08.889718+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69165056 unmapped: 1048576 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:09.889863+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69173248 unmapped: 1040384 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:10.889983+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880404 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69173248 unmapped: 1040384 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:11.890353+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69181440 unmapped: 1032192 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:12.890505+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69181440 unmapped: 1032192 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:13.890716+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69181440 unmapped: 1032192 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:14.890873+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69197824 unmapped: 1015808 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:15.891033+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880404 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69197824 unmapped: 1015808 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:16.891201+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69197824 unmapped: 1015808 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:17.891358+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69206016 unmapped: 1007616 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:18.891489+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69214208 unmapped: 999424 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:19.891622+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69222400 unmapped: 991232 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:20.891804+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880404 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69222400 unmapped: 991232 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:21.892035+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69222400 unmapped: 991232 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:22.892196+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb229ed400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69230592 unmapped: 983040 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:23.892982+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69230592 unmapped: 983040 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:24.893141+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69230592 unmapped: 983040 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:25.893287+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 35.879364014s of 35.892936707s, submitted: 4
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879813 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69238784 unmapped: 974848 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:26.893445+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69246976 unmapped: 966656 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:27.893598+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69246976 unmapped: 966656 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:28.893756+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69255168 unmapped: 958464 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:29.893935+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69255168 unmapped: 958464 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:30.894229+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69263360 unmapped: 950272 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:31.894336+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69271552 unmapped: 942080 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:32.894501+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69271552 unmapped: 942080 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:33.894691+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69287936 unmapped: 925696 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:34.894891+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69287936 unmapped: 925696 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:35.895055+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69287936 unmapped: 925696 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:36.895186+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69296128 unmapped: 917504 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:37.895311+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69296128 unmapped: 917504 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:38.895483+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69312512 unmapped: 901120 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:39.895624+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69312512 unmapped: 901120 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:40.895751+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69312512 unmapped: 901120 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:41.895925+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69328896 unmapped: 884736 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:42.896103+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69328896 unmapped: 884736 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:43.896275+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69337088 unmapped: 876544 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:44.896405+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69337088 unmapped: 876544 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:45.896542+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69337088 unmapped: 876544 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:46.896670+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69361664 unmapped: 851968 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:47.896791+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69361664 unmapped: 851968 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:48.897051+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69378048 unmapped: 835584 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:49.897293+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69386240 unmapped: 827392 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:50.897469+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69394432 unmapped: 819200 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:51.897589+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69394432 unmapped: 819200 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:52.897687+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69402624 unmapped: 811008 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:53.897913+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69402624 unmapped: 811008 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:54.898049+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69410816 unmapped: 802816 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:55.898242+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69410816 unmapped: 802816 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:56.898407+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69410816 unmapped: 802816 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:57.898583+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69419008 unmapped: 794624 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:58.898810+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69427200 unmapped: 786432 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:51:59.899052+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69435392 unmapped: 778240 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:00.899222+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69443584 unmapped: 770048 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:01.899389+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69451776 unmapped: 761856 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:02.899545+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69459968 unmapped: 753664 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:03.899744+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69468160 unmapped: 745472 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:04.899942+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69468160 unmapped: 745472 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:05.900099+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69476352 unmapped: 737280 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:06.900250+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69476352 unmapped: 737280 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:07.900437+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69476352 unmapped: 737280 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:08.900611+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69484544 unmapped: 729088 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:09.900749+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69484544 unmapped: 729088 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:10.900893+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 720896 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:11.901004+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 720896 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:12.901129+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 720896 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:13.901296+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69500928 unmapped: 712704 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:14.901449+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69500928 unmapped: 712704 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:15.901622+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69500928 unmapped: 712704 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:16.901775+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69525504 unmapped: 688128 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:17.901902+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69525504 unmapped: 688128 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:18.902059+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69541888 unmapped: 671744 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:19.902235+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69550080 unmapped: 663552 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:20.902368+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69558272 unmapped: 655360 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:21.902970+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69574656 unmapped: 638976 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:22.903153+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69574656 unmapped: 638976 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:23.903329+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69582848 unmapped: 630784 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:24.903486+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69582848 unmapped: 630784 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:25.903612+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69582848 unmapped: 630784 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:26.903762+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 622592 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:27.903929+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 622592 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:28.904097+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 622592 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:29.904271+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 614400 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:30.904479+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 614400 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:31.904665+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb229ed400 session 0x55cb24f36960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 614400 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:32.904809+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69607424 unmapped: 606208 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:33.905127+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69623808 unmapped: 589824 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:34.905303+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69632000 unmapped: 581632 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:35.905505+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69640192 unmapped: 573440 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:36.905678+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69648384 unmapped: 565248 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:37.905821+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69656576 unmapped: 557056 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:38.905961+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69656576 unmapped: 557056 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:39.906115+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69664768 unmapped: 548864 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:40.906282+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69664768 unmapped: 548864 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:41.906451+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69672960 unmapped: 540672 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:42.906969+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69681152 unmapped: 532480 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:43.907424+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69681152 unmapped: 532480 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:44.907590+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69681152 unmapped: 532480 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 79.674690247s of 79.682121277s, submitted: 2
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:45.907704+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880734 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69689344 unmapped: 524288 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:46.907964+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69697536 unmapped: 516096 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:47.908100+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69697536 unmapped: 516096 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:48.908303+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69722112 unmapped: 491520 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:49.908473+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69722112 unmapped: 491520 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:50.908624+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69730304 unmapped: 483328 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:51.908780+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69754880 unmapped: 458752 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:52.908934+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69763072 unmapped: 450560 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:53.909106+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69771264 unmapped: 442368 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:54.909234+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69771264 unmapped: 442368 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:55.909351+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69787648 unmapped: 425984 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:56.909473+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69787648 unmapped: 425984 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:57.909666+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69787648 unmapped: 425984 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:58.909882+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69787648 unmapped: 425984 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:52:59.910009+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69795840 unmapped: 417792 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:00.910156+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69795840 unmapped: 417792 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:01.910303+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69804032 unmapped: 409600 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:02.910425+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69804032 unmapped: 409600 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:03.910570+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69820416 unmapped: 393216 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:04.910755+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69820416 unmapped: 393216 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:05.910949+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69820416 unmapped: 393216 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:06.911066+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69828608 unmapped: 385024 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:07.911196+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69828608 unmapped: 385024 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:08.911546+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69828608 unmapped: 385024 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:09.911714+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69836800 unmapped: 376832 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:10.911938+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69853184 unmapped: 360448 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:11.912160+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69861376 unmapped: 352256 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:12.912297+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 344064 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:13.912466+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 344064 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:14.912690+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69877760 unmapped: 335872 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:15.912865+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69877760 unmapped: 335872 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:16.912998+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69894144 unmapped: 319488 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:17.913173+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69902336 unmapped: 311296 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:18.913305+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69902336 unmapped: 311296 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:19.913424+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 303104 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:20.913651+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 303104 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:21.913811+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 303104 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:22.914051+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 294912 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:23.914248+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 294912 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:24.914404+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 294912 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:25.914592+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69935104 unmapped: 278528 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:26.914780+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69935104 unmapped: 278528 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:27.914941+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69951488 unmapped: 262144 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:28.915100+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69951488 unmapped: 262144 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:29.915241+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69959680 unmapped: 253952 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:30.915398+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69967872 unmapped: 245760 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:31.915613+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69967872 unmapped: 245760 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:32.915798+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69984256 unmapped: 229376 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:33.916007+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69984256 unmapped: 229376 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:34.916177+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69984256 unmapped: 229376 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:35.916295+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70000640 unmapped: 212992 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:36.916501+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70008832 unmapped: 204800 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:37.916761+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 196608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:38.917024+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 196608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:39.917165+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 196608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:40.917303+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1800 session 0x55cb24a60d20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 196608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:41.917437+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70025216 unmapped: 188416 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:42.917591+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70025216 unmapped: 188416 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:43.917770+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70033408 unmapped: 180224 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:44.917913+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70033408 unmapped: 180224 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:45.918051+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70041600 unmapped: 172032 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:46.918196+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70049792 unmapped: 163840 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:47.918425+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70057984 unmapped: 155648 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:48.918638+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70066176 unmapped: 147456 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:49.918772+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70066176 unmapped: 147456 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:50.918941+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70074368 unmapped: 139264 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:51.919717+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb23919860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70082560 unmapped: 131072 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:52.920168+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70082560 unmapped: 131072 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:53.920578+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70090752 unmapped: 122880 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:54.921000+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70090752 unmapped: 122880 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:55.921146+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70090752 unmapped: 122880 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:56.921307+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:57.921449+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70098944 unmapped: 114688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:58.922330+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70098944 unmapped: 114688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:53:59.922465+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70098944 unmapped: 114688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:00.922601+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 106496 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:01.922775+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 106496 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:02.923410+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70123520 unmapped: 90112 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:03.923556+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70123520 unmapped: 90112 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:04.924144+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70123520 unmapped: 90112 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:05.924316+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70131712 unmapped: 81920 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 80.678237915s of 80.685699463s, submitted: 2
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881655 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:06.924469+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70148096 unmapped: 65536 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:07.924645+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70148096 unmapped: 65536 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:08.924896+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70164480 unmapped: 49152 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:09.925128+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70172672 unmapped: 40960 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:10.925480+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70172672 unmapped: 40960 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:11.925630+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70172672 unmapped: 40960 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:12.925773+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70180864 unmapped: 32768 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:13.926064+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70180864 unmapped: 32768 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:14.926207+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70189056 unmapped: 24576 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:15.926325+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70189056 unmapped: 24576 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:16.926506+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70197248 unmapped: 16384 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:17.926657+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70197248 unmapped: 16384 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:18.927028+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70197248 unmapped: 16384 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:19.927202+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 8192 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:20.927467+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 8192 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:21.927738+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 1040384 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:22.927907+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70230016 unmapped: 1032192 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:23.928084+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70238208 unmapped: 1024000 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:24.928206+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 1015808 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:25.928412+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 1015808 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:26.928547+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 991232 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:27.928715+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 991232 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:28.928932+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 991232 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:29.929099+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 983040 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:30.929332+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 983040 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:31.929484+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 983040 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:32.929679+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70287360 unmapped: 974848 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:33.929913+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70287360 unmapped: 974848 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:34.930058+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70287360 unmapped: 974848 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:35.930188+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70295552 unmapped: 966656 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:36.930322+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70295552 unmapped: 966656 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:37.930454+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70295552 unmapped: 966656 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:38.930645+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 958464 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:39.930789+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70311936 unmapped: 950272 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:40.930940+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70311936 unmapped: 950272 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:41.931142+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70320128 unmapped: 942080 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:42.931257+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70328320 unmapped: 933888 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:43.931505+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70336512 unmapped: 925696 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:44.931646+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 917504 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:45.931785+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 917504 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:46.931928+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 917504 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:47.932034+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70361088 unmapped: 901120 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:48.932129+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70361088 unmapped: 901120 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:49.932259+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 892928 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:50.932432+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 892928 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:51.932606+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 892928 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:52.932750+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 884736 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:53.932905+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 884736 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:54.933060+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 884736 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:55.933190+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 876544 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:56.933347+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 876544 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:57.935594+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 876544 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:58.936404+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70393856 unmapped: 868352 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:54:59.936548+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70393856 unmapped: 868352 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:00.938472+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 860160 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:01.939475+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 860160 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:02.939783+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 860160 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:03.940953+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70410240 unmapped: 851968 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:04.941301+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70410240 unmapped: 851968 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:05.942756+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70410240 unmapped: 851968 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:06.943671+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70418432 unmapped: 843776 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:07.943896+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 835584 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:08.944056+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 827392 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:09.944214+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 827392 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:10.946933+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 827392 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:11.947594+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70451200 unmapped: 811008 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:12.947738+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70451200 unmapped: 811008 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:13.947941+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70451200 unmapped: 811008 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:14.948248+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70459392 unmapped: 802816 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:15.948508+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70459392 unmapped: 802816 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:16.948803+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70459392 unmapped: 802816 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:17.948962+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 794624 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:18.949277+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 794624 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 5546 writes, 24K keys, 5546 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 5546 writes, 880 syncs, 6.30 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 5546 writes, 24K keys, 5546 commit groups, 1.0 writes per commit group, ingest: 18.97 MB, 0.03 MB/s
                                           Interval WAL: 5546 writes, 880 syncs, 6.30 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:19.949532+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70533120 unmapped: 729088 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:20.949700+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70533120 unmapped: 729088 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:21.950003+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70533120 unmapped: 729088 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:22.950278+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 720896 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:23.950485+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 720896 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:24.950670+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 712704 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:25.950986+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 712704 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b0400 session 0x55cb237c54a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:26.951175+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 712704 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:27.951485+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 704512 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:28.951691+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 704512 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:29.951999+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 704512 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:30.952292+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70565888 unmapped: 696320 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:31.952475+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 688128 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:32.952711+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 679936 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:33.952947+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 679936 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:34.953091+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 679936 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:35.953224+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70590464 unmapped: 671744 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:36.953400+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 663552 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:37.953581+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 663552 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:38.953773+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 647168 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:39.953994+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 647168 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 94.012550354s of 94.021888733s, submitted: 3
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:40.954172+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70623232 unmapped: 638976 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:41.954440+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886191 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70623232 unmapped: 638976 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:42.954658+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70623232 unmapped: 638976 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:43.954903+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 630784 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:44.955070+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 630784 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:45.955233+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70639616 unmapped: 622592 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:46.955389+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887112 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 614400 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:47.955512+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 614400 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:48.955651+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 606208 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:49.955794+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 606208 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:50.955886+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 598016 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:51.956018+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 589824 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:52.956154+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 589824 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:53.956357+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 581632 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:54.956579+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 581632 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:55.956715+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 581632 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:56.956871+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 573440 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:57.956980+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 573440 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:58.957112+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 565248 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:55:59.957315+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 565248 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:00.957547+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 565248 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:01.957711+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 557056 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:02.957929+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 557056 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:03.958095+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 557056 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:04.958246+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 548864 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:05.958396+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 548864 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:06.958669+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 540672 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:07.958802+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 540672 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:08.958976+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 540672 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:09.959126+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 532480 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:10.959310+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 532480 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:11.959452+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 532480 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:12.959638+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 524288 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:13.959898+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 524288 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:14.960081+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 516096 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:15.960291+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 507904 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:16.960532+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 491520 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:17.960763+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 483328 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:18.961021+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 483328 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:19.961245+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 483328 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:20.961462+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 475136 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:21.961625+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 475136 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:22.961885+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 475136 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:23.962131+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 466944 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:24.962325+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 466944 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:25.962460+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 458752 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:26.962594+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 450560 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:27.962760+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 442368 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:28.962903+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 442368 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:29.963029+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 442368 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:30.963195+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 442368 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 51.020427704s of 51.034717560s, submitted: 4
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:31.963389+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888033 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:32.963531+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:33.963690+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70901760 unmapped: 360448 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:34.963859+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:35.964047+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:36.964212+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886851 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:37.964393+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:38.964517+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:39.964659+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:40.964789+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:41.964962+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886851 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:42.965142+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a605a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:43.965322+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:44.965453+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:45.965583+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:46.965750+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886851 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:47.965903+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:48.966092+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:49.966291+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:50.966440+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:51.966613+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886851 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 1269760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:52.966952+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 1269760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:53.967255+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 1253376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:54.967409+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 1253376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:55.967581+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71065600 unmapped: 1245184 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:56.967735+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886851 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.256875992s of 26.075702667s, submitted: 205
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 1220608 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:57.967907+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 1220608 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:58.968135+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 1212416 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:56:59.968312+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb229ed400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 1212416 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:00.968480+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 1196032 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:01.968612+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889875 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 1187840 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:02.968746+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:03.968944+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:04.969238+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:05.969779+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 1171456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:06.969927+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 1171456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:07.970783+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 1171456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:08.971336+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 1155072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:09.971650+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 1138688 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:10.971905+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 1122304 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:11.972164+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:12.972360+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:13.972556+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:14.972706+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:15.972902+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:16.973333+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:17.973691+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:18.973961+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:19.974080+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:20.974220+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:21.974546+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:22.976126+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:23.977801+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:24.978072+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:25.978264+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:26.978495+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:27.978644+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:28.978794+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:29.978949+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:30.979064+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:31.979220+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:32.979373+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:33.979541+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:34.979717+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:35.979991+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:36.980192+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:37.980387+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:38.980548+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:39.980691+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:40.980947+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:41.981090+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:42.981279+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:43.982230+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:44.982361+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:45.982519+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:46.982657+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:47.982792+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:48.982948+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:49.983085+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:50.983214+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:51.983337+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:52.983497+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:53.983666+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:54.983788+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:55.983892+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:56.984040+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:57.984205+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:58.984357+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:59.984514+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:00.984659+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:01.984874+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:02.985022+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:03.985195+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:04.985348+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:05.985484+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:06.985664+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:07.985782+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:08.985880+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:09.986034+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:10.986172+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:11.986304+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:12.986448+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:13.986602+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:14.986768+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:15.986890+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:16.987534+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:17.987651+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:18.987924+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:19.988099+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:20.988307+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:21.988546+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:22.988762+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:23.988974+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:24.989265+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:25.989485+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:26.989692+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:27.989986+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:28.990167+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:29.990301+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:30.990555+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:31.990719+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:32.990881+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:33.991043+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:34.991161+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:35.991399+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:36.991588+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:37.991728+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:38.991919+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:39.992039+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:40.992216+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:41.992359+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:42.992591+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:43.992773+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:44.992935+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:45.993058+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:46.993202+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 1155072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:47.993323+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 1155072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:48.993546+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:49.993715+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:50.993878+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:51.994007+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 1138688 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:52.994211+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:53.994527+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:54.994746+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:55.994945+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:56.995069+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:57.995257+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:58.995432+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:59.995611+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:00.995780+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:01.995952+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:02.996086+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:03.996216+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:04.996331+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:05.996480+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:06.996720+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:07.996879+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:08.997065+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:09.997188+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:10.997327+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:11.997457+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:12.997594+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:13.998034+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:14.998212+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:15.998331+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:16.998499+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:17.998642+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:18.998955+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:19.999136+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:20.999294+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:21.999447+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:22.999608+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:23.999800+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:24.999983+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:26.000114+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:27.000293+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:28.000406+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:29.000587+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:30.000758+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:31.000939+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:32.001088+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:33.001275+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:34.001553+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:35.001708+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:36.001852+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:37.001984+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:38.002122+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:39.002291+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:40.002461+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:41.002606+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:42.002741+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:43.002889+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:44.003067+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:45.003253+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb229ed400 session 0x55cb257c9860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:46.003666+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:47.003738+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:48.003874+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:49.004103+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:50.004233+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:51.004373+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:52.004544+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:53.004676+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:54.004848+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:55.004999+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:56.005139+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b0400 session 0x55cb257770e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:57.005283+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:58.005481+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:59.006310+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:00.006495+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:01.006638+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:02.006781+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:03.006885+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:04.007043+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:05.007166+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:06.007291+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:07.007448+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:08.007574+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:09.007701+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:10.007865+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 193.200393677s of 193.215057373s, submitted: 3
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:11.008007+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:12.008131+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:13.008257+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888693 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:14.008417+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:15.008555+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:16.008680+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:17.008791+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:18.008909+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:19.009020+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:20.009140+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:21.009300+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:22.010192+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:23.010677+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:24.010900+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:25.011908+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:26.012042+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:27.012290+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:28.012618+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:29.012772+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:30.012915+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:31.013535+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:32.013721+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:33.014024+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:34.014364+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:35.014907+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb24f1a3c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:36.015092+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 909312 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:37.015274+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 909312 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:38.015472+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:39.015817+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:40.016036+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:41.016327+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:42.016620+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:43.016903+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:44.017165+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:45.017384+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.551929474s of 34.587165833s, submitted: 2
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:46.017571+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:47.017763+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:48.017979+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 891717 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:49.018131+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:50.018271+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:51.018519+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:52.018684+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:53.018870+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:54.019081+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:55.019225+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:56.019371+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:57.019518+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:58.019902+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1800 session 0x55cb25234960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:59.020105+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:00.020433+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:01.020709+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:02.020960+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:03.021174+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:04.021412+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:05.021563+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:06.021751+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:07.022054+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:08.022310+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:09.022517+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:10.022808+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:11.023043+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:12.023228+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.263769150s of 27.276128769s, submitted: 4
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:13.023460+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897174 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:14.023758+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:15.023925+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:16.024163+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:17.024368+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:18.024569+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897174 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:19.024745+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:20.024862+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:21.024985+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:22.025095+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:23.025334+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:24.026084+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:25.027084+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:26.028856+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:27.029018+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:28.029165+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:29.029353+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:30.029795+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb22622400 session 0x55cb23c641e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb229ed400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:31.029946+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:32.030146+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:33.030319+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:34.030646+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:35.031536+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:36.031938+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:37.032311+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:38.032482+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:39.032667+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:40.033764+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb252354a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:41.033930+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:42.034107+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:43.034363+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:44.034612+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:45.034885+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:46.035092+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:47.035233+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:48.035366+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:49.035485+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:50.035618+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:51.035747+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:52.036131+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:53.036307+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:54.036526+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.978160858s of 41.991683960s, submitted: 4
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:55.036660+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:56.036872+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 761856 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:57.037113+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:58.037279+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897504 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:59.037406+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:00.037531+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:01.037657+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:02.037776+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:03.037923+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:04.038096+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:05.038355+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:06.038544+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:07.038692+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:08.038817+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:09.039043+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:10.039250+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:11.039391+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:12.039555+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:13.039719+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:14.039915+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:15.040067+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:16.040314+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:17.040469+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:18.040667+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:19.040814+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:20.041017+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:21.041350+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:22.041515+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:23.041754+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:24.041984+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:25.042111+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:26.042228+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:27.042517+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:28.042697+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:29.043240+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:30.043446+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:31.043618+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:32.045367+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 745472 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:33.046663+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:34.046909+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1c00 session 0x55cb2397f4a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:35.047141+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:36.047304+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:37.048023+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:38.048479+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:39.048615+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:40.049188+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:41.049329+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:42.049470+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:43.049691+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:44.049896+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:45.050025+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:46.050149+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:47.050599+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:48.051463+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 53.931362152s of 53.942428589s, submitted: 3
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897834 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:49.051867+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:50.052247+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:51.052535+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:52.052707+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:53.053763+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897834 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:54.054259+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:55.055116+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:56.055330+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:57.055668+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:58.055913+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:59.056184+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:00.056327+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:01.057259+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:02.057431+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:03.057674+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:04.057903+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:05.058123+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:06.058318+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:07.058603+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:08.058786+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:09.058946+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:10.059114+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.023803711s of 22.031444550s, submitted: 2
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:11.059297+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:12.059437+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:13.059575+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:14.059749+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:00.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:15.059894+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:16.060018+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:17.060184+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:18.060342+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:19.060513+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:20.060653+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:21.060798+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:22.060927+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:23.061024+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:24.061420+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:25.061587+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:26.061726+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb2515f000 session 0x55cb256783c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:27.061893+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:28.061997+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:29.062178+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:30.062371+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:31.062524+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:32.062682+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:33.062842+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:34.063652+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:35.063778+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:36.064415+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:37.064629+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:38.065258+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:39.065402+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.378730774s of 29.382411957s, submitted: 1
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:40.065607+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:41.065739+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:42.065884+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:43.066004+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:44.066379+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:45.066536+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:46.066673+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:47.066810+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:48.067075+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:49.067241+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:50.067391+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:51.067523+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:52.067728+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:53.068004+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:54.068247+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:55.068448+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:56.068623+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:57.068800+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:58.068982+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb256ab0e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:59.069224+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:00.069433+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:01.069688+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:02.069945+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:03.070102+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:04.070280+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:05.070439+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:06.070804+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:07.071111+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:08.071311+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:09.071531+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:10.073288+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:11.073532+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:12.073667+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 630784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.485904694s of 32.491119385s, submitted: 1
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:13.073814+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:14.074019+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899676 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:15.074209+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:16.074333+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:17.074580+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:18.074936+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:19.075109+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899676 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:20.075282+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:21.075523+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:22.075681+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:23.075984+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:24.076465+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:25.076721+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:26.076940+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread fragmentation_score=0.000024 took=0.000092s
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:27.077307+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:28.077507+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:29.077694+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:30.078003+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Oct 10 10:23:00 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/356895798' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:23:00 compute-2 nova_compute[235775]: 2025-10-10 10:23:00.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:31.078206+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:32.078391+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 581632 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:33.078696+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 581632 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:34.078935+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:35.079205+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:36.079375+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:37.079627+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:38.080032+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:39.080222+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:40.081433+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:41.081711+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:42.081902+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:43.082176+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:44.082346+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:45.082480+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:46.083292+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:47.083431+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:48.083554+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:49.083682+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:50.083769+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:51.083881+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:52.083982+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:53.084046+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:54.084302+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:55.084430+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:56.084569+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:57.085343+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:58.085522+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:59.085745+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:00.086100+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:01.086240+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:02.086363+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:03.086513+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:04.086775+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:05.087010+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:06.087305+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:07.087494+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:08.087633+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:09.087786+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:10.087892+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:11.088012+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:12.088188+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:13.088357+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:14.088531+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:15.088667+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:16.088897+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:17.089035+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:18.089189+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:19.089409+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 5994 writes, 24K keys, 5994 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 5994 writes, 1097 syncs, 5.46 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 448 writes, 699 keys, 448 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
                                           Interval WAL: 448 writes, 217 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:20.089571+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:21.089794+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:22.089992+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:23.090197+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:24.090386+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:25.090587+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:26.090774+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:27.090978+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:28.091136+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:29.091317+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:30.091488+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:31.091619+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:32.091797+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:33.092091+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:34.092373+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:35.092601+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:36.092778+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:37.092980+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:38.093108+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:39.093256+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:40.093374+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:41.093556+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:42.093715+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:43.093891+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:44.094106+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:45.094347+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-mon[74913]: pgmap v1112: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/571390437' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/19710570' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.26431 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1751714302' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2444127767' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3224783318' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.26012 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/175764086' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.16902 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3228736852' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/670445082' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:23:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2561366827' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:46.094560+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:47.094818+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:48.095137+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23d67400 session 0x55cb257c9e00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:49.095371+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:50.095537+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:51.095798+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:52.096097+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:53.096331+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:54.096559+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:55.096699+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:56.096937+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:57.097155+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:58.097338+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:59.097477+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:00.097641+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:01.097811+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:02.098070+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:03.098293+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:04.098494+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:05.098731+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:06.098952+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:07.099165+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:08.099366+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 116.097770691s of 116.122451782s, submitted: 2
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:09.099565+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:10.099710+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:11.099954+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:12.100138+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:13.100319+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:14.100602+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:15.102526+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:16.102668+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:17.104229+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:18.105623+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:19.106902+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:20.107194+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:21.108245+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:22.109126+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:23.109961+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:24.110751+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:25.111080+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:26.111656+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:27.111888+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:28.112361+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:29.112868+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:30.113125+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.223886490s of 22.228187561s, submitted: 1
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:31.113564+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:32.113931+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb25216f00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:33.114303+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:34.114623+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:35.114930+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 1490944 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:36.115134+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:37.115465+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:38.115597+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:39.115896+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:40.116073+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:41.116230+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:42.116674+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:43.116984+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:44.117301+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:45.117549+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:46.117713+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.055137634s of 15.676420212s, submitted: 212
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:47.117921+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:48.118106+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b0400 session 0x55cb257c9680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:49.118308+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904542 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:50.118499+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:51.118673+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:52.118805+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:53.118970+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:54.119184+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903951 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:55.119410+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:56.119629+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:57.119963+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:58.120246+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23d67400 session 0x55cb24a952c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:59.120457+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:00.120665+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:01.120945+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:02.121124+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:03.121360+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:04.121613+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:05.121800+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:06.121981+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:07.122168+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:08.122387+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:09.122609+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:10.122857+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:11.123050+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:12.123225+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:13.123419+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:14.123624+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:15.123908+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.002698898s of 29.016599655s, submitted: 4
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:16.124136+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:17.124408+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:18.124590+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:19.124748+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:20.124905+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:21.125075+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:22.125302+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:23.125471+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:24.125641+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:25.125792+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:26.125930+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:27.126029+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:28.126177+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:29.126277+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:30.126399+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:31.126526+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:32.126644+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:33.126799+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:34.126988+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:35.127100+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:36.127212+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:37.127349+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:38.127437+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:39.127567+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:40.127705+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:41.127902+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:42.128082+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:43.128281+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:44.128450+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:45.128621+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:46.128814+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:47.129082+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:48.129227+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:49.129394+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:50.129659+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:51.129812+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:52.129955+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:53.130104+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:54.130280+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:55.130417+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:56.130583+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:57.130741+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:58.130892+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:59.131010+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:00.131223+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:01.131377+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:02.131532+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:03.131730+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:04.132101+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:05.132293+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:06.132428+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:07.132569+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:08.132776+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:09.132954+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:10.133071+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:11.133261+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:12.133476+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:13.133649+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:14.133869+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:15.134034+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:16.134191+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:17.134372+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:18.134475+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:19.134690+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:20.134895+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:21.135060+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:22.135241+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24f1ab40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:23.135441+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:24.135618+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:25.135773+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:26.135940+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:27.136095+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:28.136244+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:29.136353+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:30.136528+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:31.136662+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:32.136880+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:33.137090+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:34.137277+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:35.137463+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:36.137596+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 81.090309143s of 81.102882385s, submitted: 3
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:37.137747+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:38.137968+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:39.138127+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:40.138339+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905793 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:41.138508+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:42.138651+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:43.138954+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:44.139190+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:45.139430+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:46.139586+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:47.139749+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:48.139948+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:49.140218+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:50.140410+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb25216000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:51.140609+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:52.140867+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:53.141113+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:54.141385+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:55.141545+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:56.141725+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:57.141902+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:58.142055+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:59.142202+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:00.142363+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:01.142542+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:02.142697+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:03.142931+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:04.143196+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:05.143345+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.442283630s of 28.465816498s, submitted: 2
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 908226 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:06.143518+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:07.143668+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:08.143926+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:09.144083+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:10.144240+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 908226 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:11.144400+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:12.144471+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:13.144630+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:14.144809+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:15.145010+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 907635 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:16.145170+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:17.145329+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:18.145470+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:19.145641+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:20.145803+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 907635 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb252174a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:21.145954+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.102451324s of 16.118749619s, submitted: 3
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75218944 unmapped: 237568 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:22.146086+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _renew_subs
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:23.146210+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _renew_subs
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 146 ms_handle_reset con 0x55cb23d67400 session 0x55cb237c54a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:24.146450+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0xfdb10/0x1b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fca64000/0x0/0x4ffc00000, data 0xffae2/0x1b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:25.146682+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926645 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _renew_subs
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb252b1c00 session 0x55cb23dda000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 65536 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:26.146979+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:27.147203+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:28.147417+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:29.147655+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:30.147877+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929443 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:31.148023+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:32.148178+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:33.148302+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:34.148473+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.733337402s of 13.866815567s, submitted: 54
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:35.148606+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930955 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:36.148742+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:37.149015+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25754800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:38.149124+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:39.149292+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:40.149473+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931459 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:41.149611+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:42.149757+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:43.149926+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:44.150108+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:45.150268+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930277 data_alloc: 218103808 data_used: 53248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:46.150381+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:47.150579+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:48.150754+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:49.150923+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:50.151103+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:51.151297+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:52.151447+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:53.151646+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:54.151910+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:55.152107+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:56.152324+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:57.152483+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:58.152689+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:59.152898+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:00.153146+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:01.153329+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:02.153512+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:03.153689+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:04.153950+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb2515f000 session 0x55cb256aa1e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25754c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.409894943s of 29.425935745s, submitted: 4
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb25754c00 session 0x55cb256aa5a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb23c79400 session 0x55cb25679e00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 180224 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:05.154144+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 180224 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:06.154381+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 1228800 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fca5f000/0x0/0x4ffc00000, data 0x103cdd/0x1bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:07.154535+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb23d67400 session 0x55cb24a1e960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb2515f000 session 0x55cb25217860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb252b1c00 session 0x55cb250810e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb25755000 session 0x55cb23ddb860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb23c79400 session 0x55cb23ddb0e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 11272192 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:08.154683+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 11264000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:09.154910+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78888960 unmapped: 10215424 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c4000/0x0/0x4ffc00000, data 0xa9ae51/0xb56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:10.155048+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019665 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24c6a1e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:11.155216+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:12.155367+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f000 session 0x55cb25080d20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:13.155551+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:14.155759+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c4000/0x0/0x4ffc00000, data 0xa9ae51/0xb56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb252a34a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.058108330s of 10.286009789s, submitted: 78
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb257765a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 10240000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:15.155964+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1018571 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 10240000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:16.156175+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 80191488 unmapped: 8912896 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:17.156323+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88702976 unmapped: 401408 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:18.156507+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88702976 unmapped: 401408 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:19.156756+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:20.156987+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1085887 data_alloc: 234881024 data_used: 10084352
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:21.157197+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:22.157324+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:23.157476+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:24.157656+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:25.157816+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1086640 data_alloc: 234881024 data_used: 10084352
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:26.158149+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.044337273s of 12.061671257s, submitted: 5
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:27.158358+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 95715328 unmapped: 4505600 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:28.158519+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 96894976 unmapped: 3325952 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:29.158695+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97017856 unmapped: 3203072 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:30.158814+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1210611 data_alloc: 234881024 data_used: 10915840
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:31.158942+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:32.159094+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:33.159247+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:34.159430+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:35.159579+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1210611 data_alloc: 234881024 data_used: 10915840
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:36.159741+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:37.159898+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:38.160025+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:39.160152+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:40.160301+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1211523 data_alloc: 234881024 data_used: 10985472
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:41.160476+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:42.160974+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:43.161123+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:44.161333+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:45.161514+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1211523 data_alloc: 234881024 data_used: 10985472
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:46.161668+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:47.161820+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:48.161955+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb2397e000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb2397f2c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755c00 session 0x55cb2397e960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:49.162087+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515fc00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.521245956s of 22.787330627s, submitted: 111
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb24c590e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb251034a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 96673792 unmapped: 3547136 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:50.162206+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207099 data_alloc: 234881024 data_used: 10989568
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb25103680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515fc00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb25102780
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24a1f860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb24a1ed20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755c00 session 0x55cb24a1f0e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb22a681e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:51.162371+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:52.162523+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:53.162610+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb2397e3c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:54.162784+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:55.162928+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1274581 data_alloc: 234881024 data_used: 10989568
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:56.163074+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:57.163191+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515fc00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb22a501e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:58.163349+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb222bcf00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:59.163472+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:00.163593+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb257765a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.540281296s of 10.644290924s, submitted: 20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25776f00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280396 data_alloc: 234881024 data_used: 10989568
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97419264 unmapped: 16449536 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:01.163742+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515fc00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97042432 unmapped: 16826368 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:02.163894+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97042432 unmapped: 16826368 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:03.164036+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100958208 unmapped: 12910592 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:04.164617+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:05.165248+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105037824 unmapped: 8830976 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337852 data_alloc: 234881024 data_used: 19390464
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:06.165703+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105037824 unmapped: 8830976 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:07.166232+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:08.166586+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:09.166753+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:10.167004+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339364 data_alloc: 234881024 data_used: 19390464
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.622159004s of 10.657759666s, submitted: 8
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:11.167274+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:12.167440+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:13.167668+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:14.167921+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111075328 unmapped: 3850240 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:15.168104+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111214592 unmapped: 3710976 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1423561 data_alloc: 234881024 data_used: 20221952
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ead000/0x0/0x4ffc00000, data 0x2b07ea4/0x2bc7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:16.168274+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111443968 unmapped: 3481600 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:17.168429+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111443968 unmapped: 3481600 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:18.168567+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111640576 unmapped: 3284992 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:19.168685+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111640576 unmapped: 3284992 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:20.168878+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111648768 unmapped: 3276800 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1423561 data_alloc: 234881024 data_used: 20221952
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:21.169057+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111648768 unmapped: 3276800 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.157508850s of 10.318835258s, submitted: 79
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:22.169197+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:23.169412+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257774a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb22e632c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:24.169605+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb25235860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:25.169877+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220056 data_alloc: 234881024 data_used: 10858496
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:26.170037+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa067000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:27.170189+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa067000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:28.170422+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:29.170609+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:30.170778+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb252165a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb251023c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102637568 unmapped: 12288000 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969360 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb23c64d20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:31.170964+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93929472 unmapped: 20996096 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:32.171170+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:33.171358+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:34.171555+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:35.171738+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:36.171954+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:37.172098+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:38.172276+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:39.172412+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:40.172565+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:41.172734+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:42.172901+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:43.173079+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:44.173273+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:45.173429+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:46.173567+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:47.173666+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:48.173776+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:49.173894+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:50.174021+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:51.174220+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:52.174426+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:53.174603+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:54.174796+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:55.174915+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:56.175021+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.769847870s of 34.920715332s, submitted: 64
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97558528 unmapped: 28000256 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb256aba40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:57.175155+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:58.175285+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:59.175363+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:00.175473+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb256aa960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1091241 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:01.175606+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:02.175707+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:03.175806+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:04.175983+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:05.176109+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb256aad20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1091241 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:06.176260+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93896704 unmapped: 31662080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:07.176394+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93896704 unmapped: 31662080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:08.176811+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:09.177772+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:10.178534+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201137 data_alloc: 234881024 data_used: 12959744
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:11.179145+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:12.179456+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:13.179597+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:14.180023+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:15.180196+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201137 data_alloc: 234881024 data_used: 12959744
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:16.180631+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99991552 unmapped: 25567232 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:17.180921+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100007936 unmapped: 25550848 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.422227859s of 21.509000778s, submitted: 20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:18.181076+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106987520 unmapped: 18571264 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a14000/0x0/0x4ffc00000, data 0x1faddef/0x2068000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:19.181450+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108470272 unmapped: 17088512 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a14000/0x0/0x4ffc00000, data 0x1faddef/0x2068000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:20.181641+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108478464 unmapped: 17080320 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319733 data_alloc: 234881024 data_used: 14139392
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:21.181986+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:22.182140+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f996e000/0x0/0x4ffc00000, data 0x2053def/0x210e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:23.182524+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f996e000/0x0/0x4ffc00000, data 0x2053def/0x210e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:24.182819+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:25.183046+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311406 data_alloc: 234881024 data_used: 14139392
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:26.183305+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:27.183566+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f994a000/0x0/0x4ffc00000, data 0x2077def/0x2132000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:28.183722+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:29.183925+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107175936 unmapped: 18382848 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.874601364s of 12.195711136s, submitted: 124
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:30.184154+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 17334272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311150 data_alloc: 234881024 data_used: 14139392
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:31.184413+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9944000/0x0/0x4ffc00000, data 0x207ddef/0x2138000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:32.184632+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:33.184917+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9944000/0x0/0x4ffc00000, data 0x207ddef/0x2138000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:34.185186+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:35.185379+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311150 data_alloc: 234881024 data_used: 14139392
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:36.185591+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a605a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:37.185942+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:38.186138+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9941000/0x0/0x4ffc00000, data 0x2080def/0x213b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:39.186292+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:40.186519+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311694 data_alloc: 234881024 data_used: 14151680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:41.186722+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:42.186912+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:43.187038+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.061926842s of 14.079800606s, submitted: 4
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:44.187230+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9930000/0x0/0x4ffc00000, data 0x2091def/0x214c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 17195008 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9930000/0x0/0x4ffc00000, data 0x2091def/0x214c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:45.187392+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb25678f00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 17186816 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 982200 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:46.187524+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb25824b40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98844672 unmapped: 26714112 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:47.187667+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:48.187850+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:49.188118+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:50.188238+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983712 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:51.188445+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:52.188613+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:53.188770+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:54.188950+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:55.189113+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:56.189282+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:57.189442+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:58.189569+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:59.189710+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:00.189846+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:01.189995+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:02.190120+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:03.190283+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:04.190492+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:05.190652+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:06.190797+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:07.190928+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:08.191066+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:09.191236+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:10.191381+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:11.191542+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:12.191683+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb24f1b4a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515fc00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb24f1ad20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24f1a3c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24f1a780
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.527706146s of 28.564867020s, submitted: 20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 24657920 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24f1a1e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:13.191893+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:14.192725+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:15.193308+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:16.193853+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065364 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb249925a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:17.194925+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24992f00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98213888 unmapped: 31023104 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24c6be00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:18.195393+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24c6a1e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faea5000/0x0/0x4ffc00000, data 0xb1cdef/0xbd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97878016 unmapped: 31358976 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:19.196175+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97878016 unmapped: 31358976 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:20.196761+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97427456 unmapped: 31809536 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:21.196962+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1092582 data_alloc: 218103808 data_used: 3399680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99352576 unmapped: 29884416 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:22.197524+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99434496 unmapped: 29802496 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:23.197901+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99467264 unmapped: 29769728 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:24.198586+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:25.198801+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:26.198988+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1131798 data_alloc: 234881024 data_used: 9220096
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:27.199368+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:28.199609+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:29.199892+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:30.200198+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:31.200481+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1133926 data_alloc: 234881024 data_used: 9277440
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.591819763s of 18.735073090s, submitted: 24
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107536384 unmapped: 21700608 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:32.200730+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:33.200901+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:34.201079+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:35.201267+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:36.201421+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1232606 data_alloc: 234881024 data_used: 9793536
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107724800 unmapped: 21512192 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:37.201579+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106569728 unmapped: 22667264 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:38.201894+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x16cfdff/0x178b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25754800 session 0x55cb25081680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:39.202120+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:40.202293+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:41.202455+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230270 data_alloc: 234881024 data_used: 9854976
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:42.202922+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x16cfdff/0x178b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:43.203089+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.891391754s of 12.132454872s, submitted: 125
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:44.203325+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed7000/0x0/0x4ffc00000, data 0x16d9dff/0x1795000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:45.203421+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:46.203661+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230430 data_alloc: 234881024 data_used: 9854976
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:47.203951+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:48.204098+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515e400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515e400 session 0x55cb24f1ba40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515ec00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515ec00 session 0x55cb24a1bc20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:49.204219+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106627072 unmapped: 22609920 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb222bcf00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:50.204355+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f97ba000/0x0/0x4ffc00000, data 0x1df6dff/0x1eb2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107798528 unmapped: 21438464 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:51.204511+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1296232 data_alloc: 234881024 data_used: 9854976
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9630000/0x0/0x4ffc00000, data 0x1f80dff/0x203c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:52.204663+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:53.204871+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:54.205060+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:55.205145+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.719927788s of 11.824364662s, submitted: 25
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24a1e960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962d000/0x0/0x4ffc00000, data 0x1f83dff/0x203f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515e400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107929600 unmapped: 21307392 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:56.205268+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25754800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300762 data_alloc: 234881024 data_used: 9854976
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107921408 unmapped: 21315584 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:57.205388+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:58.205541+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:59.205673+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:00.205785+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:01.205913+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360346 data_alloc: 234881024 data_used: 18640896
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f83e22/0x2040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114458624 unmapped: 14778368 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:02.206046+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f83e22/0x2040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 14745600 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:03.206769+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:04.206930+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb2397f2c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:05.207045+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:06.207190+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1361578 data_alloc: 234881024 data_used: 18644992
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9629000/0x0/0x4ffc00000, data 0x1f84e22/0x2041000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.852634430s of 11.898006439s, submitted: 19
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:07.207308+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122814464 unmapped: 6422528 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:08.207465+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:09.207547+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:10.207735+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:11.207880+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1493940 data_alloc: 234881024 data_used: 20471808
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 7397376 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:12.208011+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 7397376 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:13.208154+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 7389184 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:14.208330+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb237c5a40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515e400 session 0x55cb252350e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 7389184 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:15.208438+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23c1e5a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:16.208552+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245624 data_alloc: 234881024 data_used: 9854976
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:17.210931+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:18.211540+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:19.212303+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:20.212812+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:21.213539+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245624 data_alloc: 234881024 data_used: 9854976
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:22.214244+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:23.215319+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24c6ad20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb257163c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.070486069s of 16.434377670s, submitted: 164
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb237c5c20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:24.215493+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:25.215654+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:26.215930+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:27.216189+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:28.216347+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:29.216640+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:30.216899+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:31.217434+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:32.217643+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:33.217820+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:34.218103+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:35.218341+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:36.218551+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:37.218773+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:38.218961+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:39.219345+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:40.219563+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:41.219962+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:42.220137+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:43.220284+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:44.220466+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:45.220643+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:46.220795+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:47.221040+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:48.221195+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:49.221406+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:50.221570+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.723497391s of 26.753862381s, submitted: 18
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb239192c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb23c1fe00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb252a21e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb250814a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257c8d20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:51.221796+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1118589 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:52.221924+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:53.222094+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:54.222249+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa704000/0x0/0x4ffc00000, data 0xeace51/0xf68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:55.222422+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb2397e1e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb2397e3c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:56.222573+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106676224 unmapped: 34701312 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa704000/0x0/0x4ffc00000, data 0xeace51/0xf68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1118589 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25678f00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257774a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:57.222720+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 35241984 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:58.222875+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 35241984 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:59.223040+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105660416 unmapped: 35717120 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:00.223261+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:01.223396+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220197 data_alloc: 234881024 data_used: 14155776
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:02.223616+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:03.223781+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:04.224038+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:05.224229+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:06.224416+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220197 data_alloc: 234881024 data_used: 14155776
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:07.224606+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:08.224801+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:09.225035+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.917535782s of 19.063180923s, submitted: 58
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:10.225210+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:11.225342+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ab4000/0x0/0x4ffc00000, data 0x1afae84/0x1bb8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1331335 data_alloc: 234881024 data_used: 14376960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:12.225493+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:13.225639+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:14.225847+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:15.225975+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9aa6000/0x0/0x4ffc00000, data 0x1b08e84/0x1bc6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:16.226105+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9aa6000/0x0/0x4ffc00000, data 0x1b08e84/0x1bc6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1331335 data_alloc: 234881024 data_used: 14376960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:17.226228+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a82000/0x0/0x4ffc00000, data 0x1b2ce84/0x1bea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:18.226383+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:19.226586+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 8276 writes, 33K keys, 8276 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 8276 writes, 2019 syncs, 4.10 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2282 writes, 8748 keys, 2282 commit groups, 1.0 writes per commit group, ingest: 10.36 MB, 0.02 MB/s
                                           Interval WAL: 2282 writes, 922 syncs, 2.48 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:20.226704+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:21.226850+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a82000/0x0/0x4ffc00000, data 0x1b2ce84/0x1bea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326463 data_alloc: 234881024 data_used: 14381056
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:22.227016+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.047369957s of 13.274172783s, submitted: 111
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:23.227133+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a7c000/0x0/0x4ffc00000, data 0x1b32e84/0x1bf0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:24.227297+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a7c000/0x0/0x4ffc00000, data 0x1b32e84/0x1bf0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:25.227499+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:26.227737+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326375 data_alloc: 234881024 data_used: 14381056
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:27.227935+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:28.228087+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:29.228261+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:30.228397+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a6c000/0x0/0x4ffc00000, data 0x1b42e84/0x1c00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:31.228587+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1327543 data_alloc: 234881024 data_used: 14389248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:32.228731+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:33.228959+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a6c000/0x0/0x4ffc00000, data 0x1b42e84/0x1c00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:34.229146+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:35.229334+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a1f860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d02800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d02800 session 0x55cb250803c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25716b40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24f1ab40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.387310028s of 13.403597832s, submitted: 5
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:36.229496+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24f1b680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb24c6b680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23944400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb24c6a960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25217860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a61c20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1397910 data_alloc: 234881024 data_used: 14389248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:37.229743+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f91b1000/0x0/0x4ffc00000, data 0x23fbef6/0x24bb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f91b1000/0x0/0x4ffc00000, data 0x23fbef6/0x24bb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:38.229937+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:39.230338+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:40.230518+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:41.230704+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23944400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb257c9680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118538240 unmapped: 22839296 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1402283 data_alloc: 234881024 data_used: 14389248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:42.230917+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f918b000/0x0/0x4ffc00000, data 0x2420ef6/0x24e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 22822912 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:43.231040+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121602048 unmapped: 19775488 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:44.231176+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126173184 unmapped: 15204352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:45.231303+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126173184 unmapped: 15204352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f918b000/0x0/0x4ffc00000, data 0x2420ef6/0x24e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:46.231424+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1463991 data_alloc: 234881024 data_used: 23457792
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:47.231543+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:48.231688+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9188000/0x0/0x4ffc00000, data 0x2424ef6/0x24e4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:49.231919+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:50.232039+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:51.232226+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1463991 data_alloc: 234881024 data_used: 23457792
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.817728996s of 15.977606773s, submitted: 48
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:52.232470+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 15040512 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:53.232620+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130342912 unmapped: 11034624 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:54.232776+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131219456 unmapped: 10158080 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8654000/0x0/0x4ffc00000, data 0x2f58ef6/0x3018000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:55.232941+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131219456 unmapped: 10158080 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863e000/0x0/0x4ffc00000, data 0x2f6eef6/0x302e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:56.233080+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131252224 unmapped: 10125312 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1564155 data_alloc: 234881024 data_used: 24436736
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:57.233203+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131293184 unmapped: 10084352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:58.233283+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:59.233425+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:00.233535+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863b000/0x0/0x4ffc00000, data 0x2f71ef6/0x3031000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:01.233734+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560275 data_alloc: 234881024 data_used: 24440832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863b000/0x0/0x4ffc00000, data 0x2f71ef6/0x3031000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:02.233846+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:03.233975+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:04.234107+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.395318985s of 12.603665352s, submitted: 111
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:05.234251+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:06.234338+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8635000/0x0/0x4ffc00000, data 0x2f77ef6/0x3037000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560555 data_alloc: 234881024 data_used: 24440832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:07.234459+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8635000/0x0/0x4ffc00000, data 0x2f77ef6/0x3037000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:08.234608+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:09.234741+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:10.234807+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:11.234968+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560411 data_alloc: 234881024 data_used: 24440832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:12.235078+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:13.235184+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:14.235350+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:15.235503+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131072000 unmapped: 10305536 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:16.235637+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131072000 unmapped: 10305536 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.019625664s of 12.032996178s, submitted: 5
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561307 data_alloc: 234881024 data_used: 24440832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:17.235868+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:18.236014+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:19.236193+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:20.236345+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:21.236469+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8629000/0x0/0x4ffc00000, data 0x2f80ef6/0x3040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560963 data_alloc: 234881024 data_used: 24440832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:22.236620+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:23.236774+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:24.237006+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:25.237156+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:26.237296+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561267 data_alloc: 234881024 data_used: 24440832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:27.237490+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8627000/0x0/0x4ffc00000, data 0x2f84ef6/0x3044000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.661789894s of 10.690342903s, submitted: 10
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131137536 unmapped: 10240000 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:28.237624+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131170304 unmapped: 10207232 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:29.237746+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:30.237857+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb229ed400 session 0x55cb2397fe00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d66c00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:31.237990+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561187 data_alloc: 234881024 data_used: 24440832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:32.238136+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130637824 unmapped: 10739712 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:33.238268+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130637824 unmapped: 10739712 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:34.238468+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130646016 unmapped: 10731520 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861f000/0x0/0x4ffc00000, data 0x2f8def6/0x304d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:35.238575+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130727936 unmapped: 10649600 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:36.238708+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130924544 unmapped: 10452992 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560899 data_alloc: 234881024 data_used: 24440832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:37.238859+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130924544 unmapped: 10452992 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:38.239043+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 10444800 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.915495872s of 11.539477348s, submitted: 233
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:39.239184+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130940928 unmapped: 10436608 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861c000/0x0/0x4ffc00000, data 0x2f90ef6/0x3050000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:40.239364+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130940928 unmapped: 10436608 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:41.239539+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24b1d680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb256ab2c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861c000/0x0/0x4ffc00000, data 0x2f90ef6/0x3050000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23dda3c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1344411 data_alloc: 234881024 data_used: 14389248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93d1000/0x0/0x4ffc00000, data 0x1b92e84/0x1c50000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:42.239783+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:43.239955+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:44.240139+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93f2000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:45.240311+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:46.240425+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1345059 data_alloc: 234881024 data_used: 14389248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:47.240583+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93f2000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:48.240762+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24a1ed20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb25102b40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a3b000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.092136383s of 10.210209846s, submitted: 50
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:49.240949+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25102780
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:50.241216+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:51.241523+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:52.241814+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:53.242187+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:54.242477+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:55.242647+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:56.242869+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:57.243051+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:58.243213+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:59.243382+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:00.243583+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:01.243719+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:02.243903+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:03.244107+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:04.244299+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:05.244435+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:06.244607+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:07.244747+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:08.244913+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:09.245075+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:10.245262+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:11.245441+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:12.245588+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:13.245722+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:14.245923+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:15.246067+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23944400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb24f36f00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24f37860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24f363c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:16.246218+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24f374a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.982812881s of 27.189233780s, submitted: 65
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22e62960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257770e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaac000/0x0/0x4ffc00000, data 0xb04e51/0xbc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129727 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:17.246371+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaac000/0x0/0x4ffc00000, data 0xb04e51/0xbc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:18.246505+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257761e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:19.246703+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257774a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25776d20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257763c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:20.246902+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111886336 unmapped: 33693696 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:21.247096+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111886336 unmapped: 33693696 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:22.247266+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166738 data_alloc: 218103808 data_used: 5058560
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:23.247471+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaab000/0x0/0x4ffc00000, data 0xb04e61/0xbc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:24.247678+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:25.293996+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113639424 unmapped: 31940608 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:26.294185+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22e62d20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.018430710s of 10.130161285s, submitted: 41
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb25080780
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaab000/0x0/0x4ffc00000, data 0xb04e61/0xbc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 35414016 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25717a40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:27.294315+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:28.294477+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:29.294589+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:30.294883+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:31.296916+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:32.297145+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:33.298096+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:34.298465+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:35.299441+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:36.300264+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:37.300472+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:38.300755+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25716f00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24b1dc20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24b1c960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:39.300884+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23c1e5a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.854639053s of 12.951243401s, submitted: 33
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb23c1f4a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb249781e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb24a61860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257c9c20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257c85a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:40.301464+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:41.302021+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:42.302169+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1126674 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:43.302373+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb257c8000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:44.302511+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24a601e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab9e000/0x0/0x4ffc00000, data 0xa13def/0xace000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:45.302740+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb24a605a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb250e1800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb257770e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb250e1800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110034944 unmapped: 39747584 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:46.302882+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110034944 unmapped: 39747584 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: mgrc ms_handle_reset ms_handle_reset con 0x55cb22623000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/194506248
Oct 10 10:23:00 compute-2 ceph-osd[77423]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/194506248,v1:192.168.122.100:6801/194506248]
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: get_auth_request con 0x55cb2515f400 auth_method 0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: mgrc handle_mgr_configure stats_period=5
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:47.303007+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1128267 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 39682048 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:48.303230+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113369088 unmapped: 36413440 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab9e000/0x0/0x4ffc00000, data 0xa13def/0xace000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:49.303364+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:50.303523+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:51.303680+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:52.303866+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192259 data_alloc: 234881024 data_used: 9543680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.485980988s of 13.640996933s, submitted: 24
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb25776d20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257772c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:53.304896+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb252350e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:54.305114+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:55.305273+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:56.305490+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:57.305661+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:58.305803+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:59.306008+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:00.306173+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:01.306416+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:02.306570+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:03.306758+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:04.307026+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:05.307167+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:06.307326+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:07.307480+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:08.307730+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:09.307940+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:10.308134+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:11.308368+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:12.308554+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:13.308687+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:14.308913+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:15.309071+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22a51c20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22a503c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22a50000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:16.309250+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb222bcf00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.271076202s of 23.354894638s, submitted: 36
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25080780
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb250e1800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb24b1d680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24979a40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24f37e00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb2397f2c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:17.309400+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156560 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa6e000/0x0/0x4ffc00000, data 0xb42dff/0xbfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:18.309545+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:19.309751+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:20.309902+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:21.310053+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:22.310216+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156560 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa6e000/0x0/0x4ffc00000, data 0xb42dff/0xbfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:23.310387+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:24.310595+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:25.310722+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb23d3e960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111591424 unmapped: 38191104 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb250e6000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:26.310918+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111591424 unmapped: 38191104 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:27.311125+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182488 data_alloc: 218103808 data_used: 3469312
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 37814272 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:28.311331+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:29.311589+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:30.311776+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:31.311905+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:32.312046+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221704 data_alloc: 234881024 data_used: 9289728
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:33.312217+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:34.312367+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:35.312491+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:36.312598+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:37.312763+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.874525070s of 20.991596222s, submitted: 39
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303724 data_alloc: 234881024 data_used: 9342976
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 26583040 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:38.312904+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f0dff/0x19ac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:39.313089+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:40.313255+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:41.313491+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:42.313635+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1373710 data_alloc: 234881024 data_used: 10682368
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:43.313787+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:44.313948+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f972d000/0x0/0x4ffc00000, data 0x1e83dff/0x1f3f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:45.314115+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:46.314375+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:47.314526+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375014 data_alloc: 234881024 data_used: 10694656
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:48.314665+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:49.314950+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f970f000/0x0/0x4ffc00000, data 0x1ea1dff/0x1f5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:50.315102+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:51.315307+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.182755470s of 14.495874405s, submitted: 173
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123781120 unmapped: 26001408 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:52.315467+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb23d3f0e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e6000 session 0x55cb257163c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1374934 data_alloc: 234881024 data_used: 10694656
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb250e6000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e6000 session 0x55cb2397e960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:53.315671+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:54.315819+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:55.316046+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:56.316184+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:57.316439+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:58.316731+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:59.316938+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:00.317073+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:01.317236+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:02.317487+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:03.317655+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:04.317916+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:05.318034+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:06.318205+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:07.318359+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:08.318506+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:09.318609+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:10.318774+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:11.318939+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:12.319083+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:13.319251+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:14.319409+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:15.319508+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:16.319621+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:17.319778+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb237c5c20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb2397e000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb23c1f0e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb257772c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.957300186s of 26.056312561s, submitted: 38
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22e63680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a95860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a1ed20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22b01000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb22a68d20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb239183c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:18.319897+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:19.320023+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:20.320180+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:21.320304+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:22.320463+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24a1a1e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217690 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22b01000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:23.320608+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115064832 unmapped: 46784512 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:24.320710+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 43900928 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:25.320839+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:26.320960+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:27.321110+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1334994 data_alloc: 234881024 data_used: 17313792
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:28.321266+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:29.321499+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:30.321727+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 36855808 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:31.321902+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 36855808 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:32.322026+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 36823040 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1334994 data_alloc: 234881024 data_used: 17313792
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:33.322165+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 36814848 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.868372917s of 16.010391235s, submitted: 40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:34.322308+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130957312 unmapped: 30892032 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93b2000/0x0/0x4ffc00000, data 0x1deddff/0x1ea9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:35.322456+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131473408 unmapped: 30375936 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb258243c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25825e00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb258252c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816fc00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816fc00 session 0x55cb258250e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:36.322636+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25824f00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131596288 unmapped: 30253056 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:37.322855+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1516856 data_alloc: 234881024 data_used: 18149376
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:38.323023+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb24f374a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:39.323167+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8737000/0x0/0x4ffc00000, data 0x2a68dff/0x2b24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb24f361e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:40.324313+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:41.325002+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb24f37680
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816f000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:42.325105+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816f000 session 0x55cb24f37a40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131702784 unmapped: 30146560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1519196 data_alloc: 234881024 data_used: 18149376
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:43.325649+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131702784 unmapped: 30146560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8712000/0x0/0x4ffc00000, data 0x2a8ce32/0x2b4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:44.325874+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 133881856 unmapped: 27967488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:45.326053+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140607488 unmapped: 21241856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:46.326243+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.067866325s of 12.396329880s, submitted: 138
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f870f000/0x0/0x4ffc00000, data 0x2a8fe32/0x2b4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:47.326688+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1595808 data_alloc: 251658240 data_used: 29470720
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:48.327117+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:49.327482+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f870f000/0x0/0x4ffc00000, data 0x2a8fe32/0x2b4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:50.327639+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:51.327819+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:52.328027+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1595808 data_alloc: 251658240 data_used: 29470720
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:53.328180+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:54.328605+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140648448 unmapped: 21200896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:55.328888+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143187968 unmapped: 18661376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7d06000/0x0/0x4ffc00000, data 0x3498e32/0x3556000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:56.329149+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.850776672s of 10.000297546s, submitted: 56
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143392768 unmapped: 18456576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:57.329267+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1680754 data_alloc: 251658240 data_used: 29532160
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:58.329400+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:59.329664+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:00.329854+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:01.330075+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd8000/0x0/0x4ffc00000, data 0x34c5e32/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:02.330264+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd8000/0x0/0x4ffc00000, data 0x34c5e32/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143597568 unmapped: 18251776 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679274 data_alloc: 251658240 data_used: 29532160
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:03.330473+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:04.330663+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:05.330867+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:06.331007+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:07.331194+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679746 data_alloc: 251658240 data_used: 29532160
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:08.331399+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:09.331552+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:10.331689+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:11.331881+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:12.332010+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679746 data_alloc: 251658240 data_used: 29532160
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:13.332157+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.070764542s of 17.136646271s, submitted: 20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143728640 unmapped: 18120704 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:14.332433+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143728640 unmapped: 18120704 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb257c94a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25678960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:15.332608+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136912896 unmapped: 24936448 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb22a51c20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:16.332743+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:17.332855+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f92e4000/0x0/0x4ffc00000, data 0x1ebbdff/0x1f77000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1445716 data_alloc: 234881024 data_used: 18149376
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:18.332981+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:19.333071+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f92e4000/0x0/0x4ffc00000, data 0x1ebbdff/0x1f77000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb252a25a0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a1be00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:20.333191+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136953856 unmapped: 24895488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24992960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:21.333365+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:22.333585+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:23.333732+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:24.333888+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:25.334097+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:26.334247+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:27.334382+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:28.334492+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:29.334640+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:30.334770+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:31.334889+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:32.335063+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:33.335215+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:34.335379+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:35.335554+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:36.335747+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:37.335891+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:38.336078+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:39.336258+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:40.336457+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:41.336626+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:42.336787+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:43.336916+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:44.337692+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:45.337970+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:46.338100+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22b01000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb24b1c960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24c6a1e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123912192 unmapped: 37937152 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb24c6ba40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e400
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb24a941e0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.266139984s of 33.476127625s, submitted: 90
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a95c20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22b01000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb257c8d20
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22a51e00
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:47.338273+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25717a40
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb25717860
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162264 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:48.339354+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:49.340241+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:50.341881+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:51.342009+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:52.343596+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162264 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:53.344229+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a60000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:54.344513+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22b01000
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:55.344861+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:56.345321+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:57.345464+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197892 data_alloc: 218103808 data_used: 5349376
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:58.346096+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:59.346269+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:00.346590+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:01.346739+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:02.347198+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197892 data_alloc: 218103808 data_used: 5349376
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:03.347411+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:04.347711+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:05.347858+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.718759537s of 18.834480286s, submitted: 44
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125886464 unmapped: 35962880 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:06.348080+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129695744 unmapped: 32153600 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:07.348189+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266056 data_alloc: 218103808 data_used: 6639616
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:08.348472+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:09.348621+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:10.348866+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9254000/0x0/0x4ffc00000, data 0xdabe61/0xe68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:11.349095+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:12.349312+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266056 data_alloc: 218103808 data_used: 6639616
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:13.349488+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:14.349805+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:15.350049+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9254000/0x0/0x4ffc00000, data 0xdabe61/0xe68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:16.350254+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.704682350s of 10.921176910s, submitted: 89
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb24a612c0
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 32022528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:17.350386+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25678960
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:18.350515+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:19.350643+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:20.350899+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:21.351032+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:22.351152+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:23.351359+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:24.351506+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:25.351634+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:26.351818+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:27.352015+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:28.352158+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:29.352330+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:30.352534+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:31.352738+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:32.352854+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:33.352965+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:34.353085+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:35.353199+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:36.353356+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:37.353507+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:38.353617+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:39.353789+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:40.353953+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:41.354087+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:42.354290+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:43.354459+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:44.354618+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:45.354759+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:46.354921+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:47.355062+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:48.355309+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:49.355427+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:50.355561+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:51.355727+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:52.355895+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:53.356008+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:54.356305+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:55.356453+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:56.356574+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:57.356711+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:58.356861+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:59.356993+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:00.357119+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:01.357278+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:02.357411+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:03.357578+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:04.357794+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:05.357951+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:06.358149+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:07.358273+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:08.358456+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:09.358597+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:10.358780+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:11.358913+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:12.359056+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:13.359172+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:14.359315+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:15.359442+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:16.359634+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:17.359876+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:18.360177+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:19.360341+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:20.360462+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:21.360607+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:22.360753+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:23.360869+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:24.361014+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:25.361145+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:26.361270+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:27.361354+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}'
Oct 10 10:23:00 compute-2 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Oct 10 10:23:00 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125870080 unmapped: 35979264 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: do_command 'config show' '{prefix=config show}'
Oct 10 10:23:00 compute-2 ceph-osd[77423]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:28.361521+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}'
Oct 10 10:23:00 compute-2 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:23:00 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:23:00 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:23:00 compute-2 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}'
Oct 10 10:23:00 compute-2 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125419520 unmapped: 36429824 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:29.361977+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125558784 unmapped: 36290560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:23:00 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:30.362096+0000)
Oct 10 10:23:00 compute-2 ceph-osd[77423]: do_command 'log dump' '{prefix=log dump}'
Oct 10 10:23:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Oct 10 10:23:00 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/320126284' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:23:01 compute-2 nova_compute[235775]: 2025-10-10 10:23:01.085 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:23:01 compute-2 nova_compute[235775]: 2025-10-10 10:23:01.086 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:23:01 compute-2 nova_compute[235775]: 2025-10-10 10:23:01.086 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:23:01 compute-2 nova_compute[235775]: 2025-10-10 10:23:01.116 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:23:01 compute-2 nova_compute[235775]: 2025-10-10 10:23:01.116 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:23:01 compute-2 nova_compute[235775]: 2025-10-10 10:23:01.116 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:23:01 compute-2 sudo[250678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:23:01 compute-2 sudo[250678]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:23:01 compute-2 sudo[250678]: pam_unix(sudo:session): session closed for user root
Oct 10 10:23:01 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Oct 10 10:23:01 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/127520907' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:23:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:01.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:01 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Oct 10 10:23:01 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/196607994' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.16911 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.26030 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.26473 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/356895798' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.16926 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/320126284' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.26042 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.16932 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1569461498' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.16944 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2296408099' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.26057 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/127520907' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2579803735' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:23:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/196607994' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:23:01 compute-2 nova_compute[235775]: 2025-10-10 10:23:01.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:23:01 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Oct 10 10:23:01 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1623996306' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:23:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Oct 10 10:23:02 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1196706439' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct 10 10:23:02 compute-2 crontab[250883]: (root) LIST (root)
Oct 10 10:23:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:02.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:02 compute-2 nova_compute[235775]: 2025-10-10 10:23:02.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:23:02 compute-2 nova_compute[235775]: 2025-10-10 10:23:02.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.26509 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: pgmap v1113: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.16953 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.26072 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1623996306' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.26530 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3971028922' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.16968 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3643359259' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.26093 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.26557 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1196706439' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/661113299' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:23:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2633175545' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:23:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:03 compute-2 nova_compute[235775]: 2025-10-10 10:23:03.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:03 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Oct 10 10:23:03 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/563292797' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 10:23:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:03.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:03 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Oct 10 10:23:03 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/781763434' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 10:23:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.16983 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.26111 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.26572 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3336206771' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.26126 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.17001 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.26587 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/914417947' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/563292797' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.17022 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.17025 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/781763434' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 10:23:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/748618102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Oct 10 10:23:04 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2622965518' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Oct 10 10:23:04 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/54031956' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Oct 10 10:23:04 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1788666203' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Oct 10 10:23:04 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1918956957' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 10:23:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:04.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Oct 10 10:23:04 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/983784123' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.26614 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: pgmap v1114: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.17052 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.26156 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.26638 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3878076498' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2622965518' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.17067 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/54031956' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2189172015' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.26165 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1788666203' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1918956957' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3442204986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/743398584' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2057841537' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3927813275' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/983784123' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct 10 10:23:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Oct 10 10:23:04 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1496940488' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Oct 10 10:23:05 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3249578032' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:23:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:05.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:23:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Oct 10 10:23:05 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1692194911' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct 10 10:23:05 compute-2 nova_compute[235775]: 2025-10-10 10:23:05.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:05 compute-2 systemd[1]: Starting Hostname Service...
Oct 10 10:23:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Oct 10 10:23:05 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1245277696' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.17088 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1496940488' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/627566640' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1961890703' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3007931880' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1496720933' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/166812341' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4102851143' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3249578032' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2734253997' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1358533371' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1178817591' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1692194911' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3908035813' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct 10 10:23:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/574716574' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Oct 10 10:23:06 compute-2 systemd[1]: Started Hostname Service.
Oct 10 10:23:06 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Oct 10 10:23:06 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/362286490' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Oct 10 10:23:06 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3035571500' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Oct 10 10:23:06 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/330407203' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct 10 10:23:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:06.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:06 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Oct 10 10:23:06 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/341481678' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:06 compute-2 ceph-mon[74913]: pgmap v1115: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1245277696' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/362286490' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3456937404' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4270522901' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3276084272' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/496033978' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3035571500' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/330407203' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1993800113' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2553354890' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/190481607' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3152055571' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/341481678' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/214331611' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 10:23:07 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Oct 10 10:23:07 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2358813671' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct 10 10:23:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:07.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:07 compute-2 ceph-mon[74913]: from='client.26773 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2432505021' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct 10 10:23:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3212453945' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1152939535' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct 10 10:23:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2358813671' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct 10 10:23:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1301091987' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct 10 10:23:07 compute-2 ceph-mon[74913]: from='client.26794 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/77092020' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2749167785' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct 10 10:23:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/85375489' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct 10 10:23:08 compute-2 nova_compute[235775]: 2025-10-10 10:23:08.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:08 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Oct 10 10:23:08 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/311730347' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct 10 10:23:08 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Oct 10 10:23:08 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2112620568' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Oct 10 10:23:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:08.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.26303 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: pgmap v1116: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.26806 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.17247 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.26312 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.26318 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.26830 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/311730347' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.17265 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.17271 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.26327 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.26854 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1010273267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2112620568' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3785517594' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Oct 10 10:23:09 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1170104191' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:09.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Oct 10 10:23:09 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4079194258' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct 10 10:23:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:09 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:09 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.17289 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.26351 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.26878 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.17295 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3805709205' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.26366 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1170104191' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1415916768' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.26905 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.17328 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/892024631' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/563491473' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4079194258' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3580709824' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3297017605' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Oct 10 10:23:10 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1651511276' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:10.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:10 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:10 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:10 compute-2 nova_compute[235775]: 2025-10-10 10:23:10.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='client.26384 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: pgmap v1117: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='client.26926 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='client.17346 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='client.26399 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='client.17373 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='client.26956 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='client.26405 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/71675879' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1792969292' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1651511276' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:23:11 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:23:11 compute-2 podman[252032]: 2025-10-10 10:23:11.227704997 +0000 UTC m=+0.089609084 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:23:11 compute-2 podman[252034]: 2025-10-10 10:23:11.238563623 +0000 UTC m=+0.095556813 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:23:11 compute-2 podman[252033]: 2025-10-10 10:23:11.254570364 +0000 UTC m=+0.116266614 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 10 10:23:11 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Oct 10 10:23:11 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/366803169' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Oct 10 10:23:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:23:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:11.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:23:11 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Oct 10 10:23:11 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3216819042' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Oct 10 10:23:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:12 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Oct 10 10:23:12 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/804085901' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Oct 10 10:23:12 compute-2 ceph-mon[74913]: from='client.17397 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:12 compute-2 ceph-mon[74913]: from='client.26989 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:12 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/447989855' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Oct 10 10:23:12 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/366803169' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Oct 10 10:23:12 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3592385461' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Oct 10 10:23:12 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3216819042' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Oct 10 10:23:12 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/804085901' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Oct 10 10:23:12 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1227914892' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Oct 10 10:23:12 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Oct 10 10:23:12 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1570390407' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Oct 10 10:23:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:12.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:13 compute-2 nova_compute[235775]: 2025-10-10 10:23:13.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:13 compute-2 ceph-mon[74913]: pgmap v1118: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:13 compute-2 ceph-mon[74913]: from='client.17463 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:13 compute-2 ceph-mon[74913]: from='client.26462 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2970845557' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Oct 10 10:23:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1570390407' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Oct 10 10:23:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3588161703' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Oct 10 10:23:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/4049190113' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Oct 10 10:23:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2769252938' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Oct 10 10:23:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2793981116' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Oct 10 10:23:13 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Oct 10 10:23:13 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/668363965' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Oct 10 10:23:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:13.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:13 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Oct 10 10:23:13 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3098976176' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Oct 10 10:23:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:14 compute-2 ceph-mon[74913]: from='client.27064 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:14 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/668363965' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Oct 10 10:23:14 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/638418532' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Oct 10 10:23:14 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/4223693866' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Oct 10 10:23:14 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3098976176' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Oct 10 10:23:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Oct 10 10:23:14 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1056008335' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Oct 10 10:23:14 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Oct 10 10:23:14 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1135114031' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Oct 10 10:23:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:14.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:15 compute-2 ceph-mon[74913]: pgmap v1119: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:15 compute-2 ceph-mon[74913]: from='client.17517 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:15 compute-2 ceph-mon[74913]: from='client.26501 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:15 compute-2 ceph-mon[74913]: from='client.27103 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:15 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2482729441' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Oct 10 10:23:15 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1056008335' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Oct 10 10:23:15 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1135114031' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Oct 10 10:23:15 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2720008755' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Oct 10 10:23:15 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1333227027' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Oct 10 10:23:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:15.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:15 compute-2 ovs-appctl[253164]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 10 10:23:15 compute-2 ovs-appctl[253173]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 10 10:23:15 compute-2 ovs-appctl[253179]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 10 10:23:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Oct 10 10:23:15 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3141273853' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Oct 10 10:23:15 compute-2 nova_compute[235775]: 2025-10-10 10:23:15.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:16 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Oct 10 10:23:16 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2028344608' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Oct 10 10:23:16 compute-2 ceph-mon[74913]: from='client.27127 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:16 compute-2 ceph-mon[74913]: from='client.17541 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:16 compute-2 ceph-mon[74913]: from='client.26522 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:16 compute-2 ceph-mon[74913]: from='client.27139 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:16 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/856513782' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Oct 10 10:23:16 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/693351407' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Oct 10 10:23:16 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3141273853' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Oct 10 10:23:16 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2028344608' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Oct 10 10:23:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:16.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:17 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Oct 10 10:23:17 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4187206927' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Oct 10 10:23:17 compute-2 ceph-mon[74913]: pgmap v1120: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:17 compute-2 ceph-mon[74913]: from='client.17565 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:17 compute-2 ceph-mon[74913]: from='client.26540 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:17 compute-2 ceph-mon[74913]: from='client.17577 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:23:17 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4129244223' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Oct 10 10:23:17 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1989748693' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Oct 10 10:23:17 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3917452810' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Oct 10 10:23:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:23:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:17.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:23:17 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Oct 10 10:23:17 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3498280741' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Oct 10 10:23:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:18 compute-2 nova_compute[235775]: 2025-10-10 10:23:18.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:18 compute-2 ceph-mon[74913]: from='client.27163 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:18 compute-2 ceph-mon[74913]: from='client.26546 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:18 compute-2 ceph-mon[74913]: from='client.17589 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:18 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1097788986' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Oct 10 10:23:18 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4187206927' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Oct 10 10:23:18 compute-2 ceph-mon[74913]: from='client.17607 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:18 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3498280741' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Oct 10 10:23:18 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1781781234' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Oct 10 10:23:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:18.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:18 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Oct 10 10:23:18 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1213327011' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:23:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Oct 10 10:23:19 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/436430043' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Oct 10 10:23:19 compute-2 ceph-mon[74913]: from='client.26570 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:19 compute-2 ceph-mon[74913]: pgmap v1121: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:19 compute-2 ceph-mon[74913]: from='client.17622 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:19 compute-2 ceph-mon[74913]: from='client.26579 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:19 compute-2 ceph-mon[74913]: from='client.27196 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1339899947' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Oct 10 10:23:19 compute-2 ceph-mon[74913]: from='client.27208 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3595267682' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Oct 10 10:23:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/929438571' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Oct 10 10:23:19 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1213327011' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:23:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:19.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:19 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Oct 10 10:23:19 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2054736604' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:20 compute-2 ceph-mon[74913]: from='client.27217 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:20 compute-2 ceph-mon[74913]: from='client.26597 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:20 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/436430043' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Oct 10 10:23:20 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2054736604' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:20 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1795034779' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:23:20 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/401893655' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:23:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Oct 10 10:23:20 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1781797121' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Oct 10 10:23:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Oct 10 10:23:20 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3032790570' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:20.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:20 compute-2 nova_compute[235775]: 2025-10-10 10:23:20.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:21 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Oct 10 10:23:21 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3519844817' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Oct 10 10:23:21 compute-2 sudo[254835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:23:21 compute-2 sudo[254835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:23:21 compute-2 sudo[254835]: pam_unix(sudo:session): session closed for user root
Oct 10 10:23:21 compute-2 ceph-mon[74913]: from='client.17661 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:21 compute-2 ceph-mon[74913]: from='client.26603 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:23:21 compute-2 ceph-mon[74913]: pgmap v1122: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:21 compute-2 ceph-mon[74913]: from='client.27250 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:21 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3929414897' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Oct 10 10:23:21 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1781797121' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Oct 10 10:23:21 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3032790570' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:21 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3539018218' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:21 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/151415482' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:21 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3519844817' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Oct 10 10:23:21 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Oct 10 10:23:21 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2014367048' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:21.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:22 compute-2 ceph-mon[74913]: from='client.26633 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:22 compute-2 ceph-mon[74913]: from='client.17697 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:22 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2014367048' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:22 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2054782978' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:22 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/928458827' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:22 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1759395512' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Oct 10 10:23:22 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/833543587' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Oct 10 10:23:22 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3180821717' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Oct 10 10:23:22 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Oct 10 10:23:22 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/314814720' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:22.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:22 compute-2 podman[254950]: 2025-10-10 10:23:22.815515884 +0000 UTC m=+0.086386490 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 10 10:23:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:23 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Oct 10 10:23:23 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/579627303' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:23 compute-2 nova_compute[235775]: 2025-10-10 10:23:23.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:23.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:23 compute-2 ceph-mon[74913]: pgmap v1123: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:23 compute-2 ceph-mon[74913]: from='client.27298 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:23 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2593878606' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:23 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/314814720' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:23 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/316001559' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:23 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2474021729' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Oct 10 10:23:23 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1610891490' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Oct 10 10:23:23 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/579627303' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:23 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Oct 10 10:23:23 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1812088201' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Oct 10 10:23:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:24 compute-2 ceph-mon[74913]: from='client.26666 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:24 compute-2 ceph-mon[74913]: from='client.17736 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:24 compute-2 ceph-mon[74913]: from='client.27325 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:24 compute-2 ceph-mon[74913]: pgmap v1124: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:24 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1812088201' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Oct 10 10:23:24 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/333192748' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:24 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1639796689' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:24 compute-2 ceph-mon[74913]: from='client.27340 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:24 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3281277725' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:24 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2129940614' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:24.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:24 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Oct 10 10:23:24 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1944349469' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Oct 10 10:23:25 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3153766169' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:25.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:25 compute-2 ceph-mon[74913]: from='client.27349 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:25 compute-2 ceph-mon[74913]: from='client.17766 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:25 compute-2 ceph-mon[74913]: from='client.26687 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:25 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1944349469' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:25 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2018417856' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Oct 10 10:23:25 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3007322319' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Oct 10 10:23:25 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3153766169' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:25 compute-2 nova_compute[235775]: 2025-10-10 10:23:25.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Oct 10 10:23:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3944079894' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: from='client.17793 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: pgmap v1125: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:26 compute-2 ceph-mon[74913]: from='client.26699 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: from='client.27382 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: from='client.17805 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: from='client.27391 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: from='client.26705 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3316168477' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3944079894' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2707461924' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1702982701' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1702982701' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:23:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Oct 10 10:23:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3285063725' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:26.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:26 compute-2 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 10 10:23:27 compute-2 systemd[1]: Starting Time & Date Service...
Oct 10 10:23:27 compute-2 systemd[1]: Started Time & Date Service.
Oct 10 10:23:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:27.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/463257619' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3285063725' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/4075409164' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:27 compute-2 ceph-mon[74913]: from='client.17856 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:27 compute-2 ceph-mon[74913]: from='client.27433 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:27 compute-2 ceph-mon[74913]: from='client.26738 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:27 compute-2 ceph-mon[74913]: from='client.17862 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:27 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Oct 10 10:23:27 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/742710764' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:28 compute-2 nova_compute[235775]: 2025-10-10 10:23:28.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:28 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Oct 10 10:23:28 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2080136583' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:28 compute-2 ceph-mon[74913]: from='client.27439 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:28 compute-2 ceph-mon[74913]: from='client.26744 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:28 compute-2 ceph-mon[74913]: pgmap v1126: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:28 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2394245946' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:28 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3733082418' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:23:28 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/742710764' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:28 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2702895973' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:28 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1570035963' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Oct 10 10:23:28 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2080136583' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:28.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:29.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:29 compute-2 ceph-mon[74913]: from='client.17901 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:29 compute-2 ceph-mon[74913]: from='client.27469 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:29 compute-2 ceph-mon[74913]: from='client.17913 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:29 compute-2 ceph-mon[74913]: from='client.26768 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:23:29 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1354894680' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:30 compute-2 ceph-mon[74913]: pgmap v1127: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:30 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2650376232' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:30 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1070543984' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:30 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/344713102' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Oct 10 10:23:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:30.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:30 compute-2 nova_compute[235775]: 2025-10-10 10:23:30.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:31.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:23:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:32 compute-2 sudo[255797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:23:32 compute-2 sudo[255797]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:23:32 compute-2 sudo[255797]: pam_unix(sudo:session): session closed for user root
Oct 10 10:23:32 compute-2 ceph-mon[74913]: pgmap v1128: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:32 compute-2 sudo[255823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:23:32 compute-2 sudo[255823]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:23:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:32.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:33 compute-2 nova_compute[235775]: 2025-10-10 10:23:33.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:33 compute-2 sudo[255823]: pam_unix(sudo:session): session closed for user root
Oct 10 10:23:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:33.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:23:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:23:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:23:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:23:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:23:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:23:33 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:23:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:34 compute-2 ceph-mon[74913]: pgmap v1129: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:34.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:35.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:35 compute-2 nova_compute[235775]: 2025-10-10 10:23:35.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:36 compute-2 ceph-mon[74913]: pgmap v1130: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:23:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:36.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:37.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:38 compute-2 nova_compute[235775]: 2025-10-10 10:23:38.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:38 compute-2 sudo[255884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:23:38 compute-2 sudo[255884]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:23:38 compute-2 sudo[255884]: pam_unix(sudo:session): session closed for user root
Oct 10 10:23:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:38.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:39 compute-2 ceph-mon[74913]: pgmap v1131: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:23:39 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:23:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:39.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:40.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:40 compute-2 nova_compute[235775]: 2025-10-10 10:23:40.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:41 compute-2 ceph-mon[74913]: pgmap v1132: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:23:41 compute-2 sudo[255913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:23:41 compute-2 sudo[255913]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:23:41 compute-2 sudo[255913]: pam_unix(sudo:session): session closed for user root
Oct 10 10:23:41 compute-2 podman[255937]: 2025-10-10 10:23:41.431852413 +0000 UTC m=+0.059724908 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:23:41 compute-2 podman[255939]: 2025-10-10 10:23:41.455672094 +0000 UTC m=+0.077472636 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:23:41 compute-2 podman[255938]: 2025-10-10 10:23:41.468776673 +0000 UTC m=+0.092441455 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 10:23:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:23:41.480 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:23:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:23:41.480 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:23:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:23:41.481 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:23:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:41.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:42.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:43 compute-2 nova_compute[235775]: 2025-10-10 10:23:43.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:43 compute-2 ceph-mon[74913]: pgmap v1133: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:23:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:43.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:44.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:45 compute-2 ceph-mon[74913]: pgmap v1134: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:45.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:45 compute-2 nova_compute[235775]: 2025-10-10 10:23:45.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:23:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:23:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:46.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:23:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:47.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:47 compute-2 ceph-mon[74913]: pgmap v1135: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:48 compute-2 nova_compute[235775]: 2025-10-10 10:23:48.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:48 compute-2 ceph-mon[74913]: pgmap v1136: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:23:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:48.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:23:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:49.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:50 compute-2 ceph-mon[74913]: pgmap v1137: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:23:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:50.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:23:50 compute-2 nova_compute[235775]: 2025-10-10 10:23:50.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:51.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:52 compute-2 ceph-mon[74913]: pgmap v1138: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:52.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:53 compute-2 nova_compute[235775]: 2025-10-10 10:23:53.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:53.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:53 compute-2 podman[256016]: 2025-10-10 10:23:53.782921834 +0000 UTC m=+0.056388151 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible)
Oct 10 10:23:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:54 compute-2 ceph-mon[74913]: pgmap v1139: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:23:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:54.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:23:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:23:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:55.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:55 compute-2 nova_compute[235775]: 2025-10-10 10:23:55.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:56 compute-2 ceph-mon[74913]: pgmap v1140: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:23:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:56.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:57 compute-2 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 10 10:23:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:57.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:57 compute-2 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 10 10:23:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:58 compute-2 nova_compute[235775]: 2025-10-10 10:23:58.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:23:58 compute-2 ceph-mon[74913]: pgmap v1141: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:23:58 compute-2 nova_compute[235775]: 2025-10-10 10:23:58.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:23:58 compute-2 nova_compute[235775]: 2025-10-10 10:23:58.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:23:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:58.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:23:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:23:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:23:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:23:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:23:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:23:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:59.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:23:59 compute-2 nova_compute[235775]: 2025-10-10 10:23:59.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:23:59 compute-2 nova_compute[235775]: 2025-10-10 10:23:59.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:23:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:23:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:00 compute-2 ceph-mon[74913]: pgmap v1142: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:00 compute-2 nova_compute[235775]: 2025-10-10 10:24:00.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:24:00 compute-2 nova_compute[235775]: 2025-10-10 10:24:00.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:24:00 compute-2 nova_compute[235775]: 2025-10-10 10:24:00.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:24:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:00.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:00 compute-2 nova_compute[235775]: 2025-10-10 10:24:00.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:00 compute-2 nova_compute[235775]: 2025-10-10 10:24:00.966 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:24:00 compute-2 nova_compute[235775]: 2025-10-10 10:24:00.967 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:24:00 compute-2 nova_compute[235775]: 2025-10-10 10:24:00.990 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:24:00 compute-2 nova_compute[235775]: 2025-10-10 10:24:00.991 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:24:00 compute-2 nova_compute[235775]: 2025-10-10 10:24:00.991 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:24:00 compute-2 nova_compute[235775]: 2025-10-10 10:24:00.991 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:24:00 compute-2 nova_compute[235775]: 2025-10-10 10:24:00.992 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:24:01 compute-2 sudo[256066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:24:01 compute-2 sudo[256066]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:24:01 compute-2 sudo[256066]: pam_unix(sudo:session): session closed for user root
Oct 10 10:24:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:24:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:01.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:24:01 compute-2 nova_compute[235775]: 2025-10-10 10:24:01.509 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:24:01 compute-2 nova_compute[235775]: 2025-10-10 10:24:01.630 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:24:01 compute-2 nova_compute[235775]: 2025-10-10 10:24:01.631 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4720MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:24:01 compute-2 nova_compute[235775]: 2025-10-10 10:24:01.631 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:24:01 compute-2 nova_compute[235775]: 2025-10-10 10:24:01.631 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:24:01 compute-2 nova_compute[235775]: 2025-10-10 10:24:01.699 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:24:01 compute-2 nova_compute[235775]: 2025-10-10 10:24:01.700 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:24:01 compute-2 nova_compute[235775]: 2025-10-10 10:24:01.718 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:24:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:24:01 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/740909772' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:24:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:24:02 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3076253319' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:24:02 compute-2 nova_compute[235775]: 2025-10-10 10:24:02.174 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:24:02 compute-2 nova_compute[235775]: 2025-10-10 10:24:02.179 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:24:02 compute-2 nova_compute[235775]: 2025-10-10 10:24:02.196 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:24:02 compute-2 nova_compute[235775]: 2025-10-10 10:24:02.198 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:24:02 compute-2 nova_compute[235775]: 2025-10-10 10:24:02.198 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:24:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:02.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:02 compute-2 ceph-mon[74913]: pgmap v1143: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3076253319' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:24:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:03 compute-2 nova_compute[235775]: 2025-10-10 10:24:03.046 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:24:03 compute-2 nova_compute[235775]: 2025-10-10 10:24:03.047 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:24:03 compute-2 nova_compute[235775]: 2025-10-10 10:24:03.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:03.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:04 compute-2 nova_compute[235775]: 2025-10-10 10:24:04.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:24:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:04.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:04 compute-2 nova_compute[235775]: 2025-10-10 10:24:04.835 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:24:04 compute-2 nova_compute[235775]: 2025-10-10 10:24:04.836 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:24:04 compute-2 ceph-mon[74913]: pgmap v1144: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:24:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1898300847' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:24:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:05.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:05 compute-2 nova_compute[235775]: 2025-10-10 10:24:05.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:05 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2544635714' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:24:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:06.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:06 compute-2 ceph-mon[74913]: pgmap v1145: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:07.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:08 compute-2 nova_compute[235775]: 2025-10-10 10:24:08.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:08.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:08 compute-2 ceph-mon[74913]: pgmap v1146: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:24:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:09.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:09 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1433281411' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:24:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:10.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:10 compute-2 nova_compute[235775]: 2025-10-10 10:24:10.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:11 compute-2 ceph-mon[74913]: pgmap v1147: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:11 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1672674889' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:24:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:11.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:11 compute-2 podman[256127]: 2025-10-10 10:24:11.712550595 +0000 UTC m=+0.058508320 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 10 10:24:11 compute-2 podman[256125]: 2025-10-10 10:24:11.717477932 +0000 UTC m=+0.067366513 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:24:11 compute-2 podman[256126]: 2025-10-10 10:24:11.739598399 +0000 UTC m=+0.089486920 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct 10 10:24:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:12.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:13 compute-2 ceph-mon[74913]: pgmap v1148: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:13 compute-2 nova_compute[235775]: 2025-10-10 10:24:13.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:24:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:13.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:24:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:14.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:15 compute-2 ceph-mon[74913]: pgmap v1149: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:24:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:15.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:15 compute-2 nova_compute[235775]: 2025-10-10 10:24:15.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:16.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:17 compute-2 ceph-mon[74913]: pgmap v1150: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:24:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:17.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:18 compute-2 nova_compute[235775]: 2025-10-10 10:24:18.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:18 compute-2 sudo[248797]: pam_unix(sudo:session): session closed for user root
Oct 10 10:24:18 compute-2 sshd-session[248796]: Received disconnect from 192.168.122.10 port 58248:11: disconnected by user
Oct 10 10:24:18 compute-2 sshd-session[248796]: Disconnected from user zuul 192.168.122.10 port 58248
Oct 10 10:24:18 compute-2 sshd-session[248793]: pam_unix(sshd:session): session closed for user zuul
Oct 10 10:24:18 compute-2 systemd-logind[796]: Session 57 logged out. Waiting for processes to exit.
Oct 10 10:24:18 compute-2 systemd[1]: session-57.scope: Deactivated successfully.
Oct 10 10:24:18 compute-2 systemd[1]: session-57.scope: Consumed 2min 45.594s CPU time, 723.5M memory peak, read 279.5M from disk, written 89.3M to disk.
Oct 10 10:24:18 compute-2 systemd-logind[796]: Removed session 57.
Oct 10 10:24:18 compute-2 sshd-session[256198]: Accepted publickey for zuul from 192.168.122.10 port 39682 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 10:24:18 compute-2 systemd-logind[796]: New session 58 of user zuul.
Oct 10 10:24:18 compute-2 systemd[1]: Started Session 58 of User zuul.
Oct 10 10:24:18 compute-2 sshd-session[256198]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 10:24:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:18.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:18 compute-2 sudo[256202]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-2-2025-10-10-lmanjxo.tar.xz
Oct 10 10:24:18 compute-2 sudo[256202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:24:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:18 compute-2 sudo[256202]: pam_unix(sudo:session): session closed for user root
Oct 10 10:24:18 compute-2 sshd-session[256201]: Received disconnect from 192.168.122.10 port 39682:11: disconnected by user
Oct 10 10:24:18 compute-2 sshd-session[256201]: Disconnected from user zuul 192.168.122.10 port 39682
Oct 10 10:24:18 compute-2 sshd-session[256198]: pam_unix(sshd:session): session closed for user zuul
Oct 10 10:24:18 compute-2 systemd[1]: session-58.scope: Deactivated successfully.
Oct 10 10:24:18 compute-2 systemd-logind[796]: Session 58 logged out. Waiting for processes to exit.
Oct 10 10:24:18 compute-2 systemd-logind[796]: Removed session 58.
Oct 10 10:24:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:19 compute-2 sshd-session[256228]: Accepted publickey for zuul from 192.168.122.10 port 39694 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 10:24:19 compute-2 systemd-logind[796]: New session 59 of user zuul.
Oct 10 10:24:19 compute-2 systemd[1]: Started Session 59 of User zuul.
Oct 10 10:24:19 compute-2 sshd-session[256228]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 10:24:19 compute-2 sudo[256232]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Oct 10 10:24:19 compute-2 sudo[256232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:24:19 compute-2 sudo[256232]: pam_unix(sudo:session): session closed for user root
Oct 10 10:24:19 compute-2 sshd-session[256231]: Received disconnect from 192.168.122.10 port 39694:11: disconnected by user
Oct 10 10:24:19 compute-2 sshd-session[256231]: Disconnected from user zuul 192.168.122.10 port 39694
Oct 10 10:24:19 compute-2 sshd-session[256228]: pam_unix(sshd:session): session closed for user zuul
Oct 10 10:24:19 compute-2 systemd[1]: session-59.scope: Deactivated successfully.
Oct 10 10:24:19 compute-2 systemd-logind[796]: Session 59 logged out. Waiting for processes to exit.
Oct 10 10:24:19 compute-2 systemd-logind[796]: Removed session 59.
Oct 10 10:24:19 compute-2 ceph-mon[74913]: pgmap v1151: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:24:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:19.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:20.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:20 compute-2 nova_compute[235775]: 2025-10-10 10:24:20.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:21 compute-2 ceph-mon[74913]: pgmap v1152: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:21.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:21 compute-2 sudo[256259]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:24:21 compute-2 sudo[256259]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:24:21 compute-2 sudo[256259]: pam_unix(sudo:session): session closed for user root
Oct 10 10:24:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:22.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:23 compute-2 nova_compute[235775]: 2025-10-10 10:24:23.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:23 compute-2 ceph-mon[74913]: pgmap v1153: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:23.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:24 compute-2 podman[256287]: 2025-10-10 10:24:24.792681588 +0000 UTC m=+0.061637730 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Oct 10 10:24:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:24.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:25 compute-2 ceph-mon[74913]: pgmap v1154: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:24:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:25.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:25 compute-2 nova_compute[235775]: 2025-10-10 10:24:25.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:24:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2262894800' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:24:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:24:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2262894800' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:24:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:24:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:26.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:24:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:27 compute-2 ceph-mon[74913]: pgmap v1155: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/2262894800' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:24:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/2262894800' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:24:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:27.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:28 compute-2 nova_compute[235775]: 2025-10-10 10:24:28.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:28.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:29 compute-2 ceph-mon[74913]: pgmap v1156: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:24:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:29.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:30.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:30 compute-2 nova_compute[235775]: 2025-10-10 10:24:30.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:31 compute-2 ceph-mon[74913]: pgmap v1157: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:24:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:31.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:32.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:33 compute-2 nova_compute[235775]: 2025-10-10 10:24:33.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:33 compute-2 ceph-mon[74913]: pgmap v1158: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:24:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:33.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:24:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:34 compute-2 ceph-mon[74913]: pgmap v1159: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:24:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:34.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:35.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:35 compute-2 nova_compute[235775]: 2025-10-10 10:24:35.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:36 compute-2 ceph-mon[74913]: pgmap v1160: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:36.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:37.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:38 compute-2 nova_compute[235775]: 2025-10-10 10:24:38.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:38 compute-2 ceph-mon[74913]: pgmap v1161: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:24:38 compute-2 sudo[256320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:24:38 compute-2 sudo[256320]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:24:38 compute-2 sudo[256320]: pam_unix(sudo:session): session closed for user root
Oct 10 10:24:38 compute-2 sudo[256346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:24:38 compute-2 sudo[256346]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:24:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:38.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:39 compute-2 sudo[256346]: pam_unix(sudo:session): session closed for user root
Oct 10 10:24:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:39.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:24:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:24:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:24:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:24:40 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:24:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:24:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:40.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:24:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:40 compute-2 nova_compute[235775]: 2025-10-10 10:24:40.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:41 compute-2 ceph-mon[74913]: pgmap v1162: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:41 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:24:41 compute-2 ceph-mon[74913]: pgmap v1163: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Oct 10 10:24:41 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:24:41 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:24:41 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:24:41 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:24:41 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:24:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:24:41.481 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:24:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:24:41.481 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:24:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:24:41.482 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:24:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:41.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:41 compute-2 sudo[256404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:24:41 compute-2 sudo[256404]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:24:41 compute-2 sudo[256404]: pam_unix(sudo:session): session closed for user root
Oct 10 10:24:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:42 compute-2 podman[256432]: 2025-10-10 10:24:42.774459894 +0000 UTC m=+0.046459765 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 10:24:42 compute-2 podman[256430]: 2025-10-10 10:24:42.776608073 +0000 UTC m=+0.054497693 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 10 10:24:42 compute-2 podman[256431]: 2025-10-10 10:24:42.802852922 +0000 UTC m=+0.078415166 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:24:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:42.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:43 compute-2 ceph-mon[74913]: pgmap v1164: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Oct 10 10:24:43 compute-2 nova_compute[235775]: 2025-10-10 10:24:43.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:43.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:44 compute-2 sudo[256492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:24:44 compute-2 sudo[256492]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:24:44 compute-2 sudo[256492]: pam_unix(sudo:session): session closed for user root
Oct 10 10:24:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:44.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:45 compute-2 ceph-mon[74913]: pgmap v1165: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Oct 10 10:24:45 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:24:45 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:24:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:45.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:45 compute-2 nova_compute[235775]: 2025-10-10 10:24:45.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:46.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:47 compute-2 ceph-mon[74913]: pgmap v1166: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Oct 10 10:24:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:24:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:24:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:47.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:24:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:48 compute-2 nova_compute[235775]: 2025-10-10 10:24:48.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:48.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:49 compute-2 ceph-mon[74913]: pgmap v1167: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Oct 10 10:24:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:49.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:50.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:50 compute-2 nova_compute[235775]: 2025-10-10 10:24:50.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:51 compute-2 ceph-mon[74913]: pgmap v1168: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Oct 10 10:24:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:51.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:52.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:53 compute-2 nova_compute[235775]: 2025-10-10 10:24:53.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:53 compute-2 ceph-mon[74913]: pgmap v1169: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:53.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.201473) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894201545, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 2430, "num_deletes": 508, "total_data_size": 5033625, "memory_usage": 5106832, "flush_reason": "Manual Compaction"}
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894221085, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 3255631, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34488, "largest_seqno": 36913, "table_properties": {"data_size": 3245062, "index_size": 5975, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3525, "raw_key_size": 29579, "raw_average_key_size": 21, "raw_value_size": 3220417, "raw_average_value_size": 2305, "num_data_blocks": 256, "num_entries": 1397, "num_filter_entries": 1397, "num_deletions": 508, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091750, "oldest_key_time": 1760091750, "file_creation_time": 1760091894, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 19642 microseconds, and 6414 cpu microseconds.
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.221128) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 3255631 bytes OK
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.221150) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.222872) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.222886) EVENT_LOG_v1 {"time_micros": 1760091894222882, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.222904) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 5020871, prev total WAL file size 5020871, number of live WAL files 2.
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.224114) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(3179KB)], [66(13MB)]
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894224165, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 17070389, "oldest_snapshot_seqno": -1}
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6497 keys, 14858799 bytes, temperature: kUnknown
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894304586, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 14858799, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14814842, "index_size": 26631, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16261, "raw_key_size": 170803, "raw_average_key_size": 26, "raw_value_size": 14697136, "raw_average_value_size": 2262, "num_data_blocks": 1053, "num_entries": 6497, "num_filter_entries": 6497, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091894, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.304900) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 14858799 bytes
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.306299) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 212.0 rd, 184.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 13.2 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(9.8) write-amplify(4.6) OK, records in: 7530, records dropped: 1033 output_compression: NoCompression
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.306325) EVENT_LOG_v1 {"time_micros": 1760091894306313, "job": 40, "event": "compaction_finished", "compaction_time_micros": 80507, "compaction_time_cpu_micros": 26858, "output_level": 6, "num_output_files": 1, "total_output_size": 14858799, "num_input_records": 7530, "num_output_records": 6497, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894307348, "job": 40, "event": "table_file_deletion", "file_number": 68}
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894310714, "job": 40, "event": "table_file_deletion", "file_number": 66}
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.224006) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.310753) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.310759) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.310761) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.310763) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:24:54 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.310765) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:24:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:54.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:55 compute-2 ceph-mon[74913]: pgmap v1170: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:24:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:24:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:55.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:55 compute-2 podman[256528]: 2025-10-10 10:24:55.785787318 +0000 UTC m=+0.058092496 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct 10 10:24:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:55 compute-2 nova_compute[235775]: 2025-10-10 10:24:55.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:56.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:57 compute-2 ceph-mon[74913]: pgmap v1171: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:24:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:57.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:58 compute-2 nova_compute[235775]: 2025-10-10 10:24:58.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:24:58 compute-2 nova_compute[235775]: 2025-10-10 10:24:58.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:24:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:24:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:58.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:24:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:24:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:24:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:24:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:24:59 compute-2 ceph-mon[74913]: pgmap v1172: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:24:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:24:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:24:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:59.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:24:59 compute-2 nova_compute[235775]: 2025-10-10 10:24:59.817 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:24:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:24:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.834 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.834 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.834 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.859 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.859 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.860 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.860 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.860 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:25:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:00.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:00 compute-2 nova_compute[235775]: 2025-10-10 10:25:00.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:01 compute-2 ceph-mon[74913]: pgmap v1173: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:01 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:25:01 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3269636147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:25:01 compute-2 nova_compute[235775]: 2025-10-10 10:25:01.304 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:25:01 compute-2 nova_compute[235775]: 2025-10-10 10:25:01.489 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:25:01 compute-2 nova_compute[235775]: 2025-10-10 10:25:01.490 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4826MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:25:01 compute-2 nova_compute[235775]: 2025-10-10 10:25:01.491 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:25:01 compute-2 nova_compute[235775]: 2025-10-10 10:25:01.491 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:25:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:25:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:01.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:25:01 compute-2 nova_compute[235775]: 2025-10-10 10:25:01.573 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:25:01 compute-2 nova_compute[235775]: 2025-10-10 10:25:01.573 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:25:01 compute-2 nova_compute[235775]: 2025-10-10 10:25:01.590 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:25:01 compute-2 sudo[256596]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:25:01 compute-2 sudo[256596]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:25:01 compute-2 sudo[256596]: pam_unix(sudo:session): session closed for user root
Oct 10 10:25:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:02 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:25:02 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1111097345' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:25:02 compute-2 nova_compute[235775]: 2025-10-10 10:25:02.036 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:25:02 compute-2 nova_compute[235775]: 2025-10-10 10:25:02.041 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:25:02 compute-2 nova_compute[235775]: 2025-10-10 10:25:02.062 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:25:02 compute-2 nova_compute[235775]: 2025-10-10 10:25:02.064 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:25:02 compute-2 nova_compute[235775]: 2025-10-10 10:25:02.065 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:25:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3269636147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:25:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:25:02 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1111097345' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:25:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:02.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:03 compute-2 nova_compute[235775]: 2025-10-10 10:25:03.060 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:25:03 compute-2 nova_compute[235775]: 2025-10-10 10:25:03.060 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:25:03 compute-2 nova_compute[235775]: 2025-10-10 10:25:03.060 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:25:03 compute-2 nova_compute[235775]: 2025-10-10 10:25:03.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:03 compute-2 ceph-mon[74913]: pgmap v1174: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:25:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:03.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:25:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:04.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:05 compute-2 ceph-mon[74913]: pgmap v1175: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:25:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:05.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:25:05 compute-2 nova_compute[235775]: 2025-10-10 10:25:05.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:25:05 compute-2 nova_compute[235775]: 2025-10-10 10:25:05.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:25:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:05 compute-2 nova_compute[235775]: 2025-10-10 10:25:05.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1184273382' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:25:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:06.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:07 compute-2 ceph-mon[74913]: pgmap v1176: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3882845651' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:25:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:07.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:08 compute-2 nova_compute[235775]: 2025-10-10 10:25:08.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:08.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:09 compute-2 ceph-mon[74913]: pgmap v1177: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:09.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:10.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:10 compute-2 nova_compute[235775]: 2025-10-10 10:25:10.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:11 compute-2 ceph-mon[74913]: pgmap v1178: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:11.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:12 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3659408900' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:25:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:12.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:13 compute-2 nova_compute[235775]: 2025-10-10 10:25:13.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:13 compute-2 ceph-mon[74913]: pgmap v1179: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1732526091' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:25:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:25:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:13.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:25:13 compute-2 podman[256637]: 2025-10-10 10:25:13.806586663 +0000 UTC m=+0.067245670 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:25:13 compute-2 podman[256635]: 2025-10-10 10:25:13.816914352 +0000 UTC m=+0.093406844 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct 10 10:25:13 compute-2 podman[256636]: 2025-10-10 10:25:13.824886318 +0000 UTC m=+0.098867160 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 10:25:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:14.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:15 compute-2 ceph-mon[74913]: pgmap v1180: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:25:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:15.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:25:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:15 compute-2 nova_compute[235775]: 2025-10-10 10:25:15.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:25:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:16.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:17 compute-2 ceph-mon[74913]: pgmap v1181: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:25:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:17.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:25:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:18 compute-2 nova_compute[235775]: 2025-10-10 10:25:18.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:18.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:19 compute-2 ceph-mon[74913]: pgmap v1182: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:25:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:19.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:25:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:25:19 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Cumulative writes: 7071 writes, 37K keys, 7071 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s
                                           Cumulative WAL: 7071 writes, 7071 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1582 writes, 8385 keys, 1582 commit groups, 1.0 writes per commit group, ingest: 17.92 MB, 0.03 MB/s
                                           Interval WAL: 1582 writes, 1582 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0    136.7      0.40              0.16        20    0.020       0      0       0.0       0.0
                                             L6      1/0   14.17 MB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   4.5    182.1    156.3      1.57              0.64        19    0.082    107K    10K       0.0       0.0
                                            Sum      1/0   14.17 MB   0.0      0.3     0.1      0.2       0.3      0.1       0.0   5.5    145.1    152.4      1.96              0.80        39    0.050    107K    10K       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.0    169.4    170.5      0.47              0.22        10    0.047     34K   3591       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   0.0    182.1    156.3      1.57              0.64        19    0.082    107K    10K       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0    137.3      0.40              0.16        19    0.021       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.053, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.29 GB write, 0.12 MB/s write, 0.28 GB read, 0.12 MB/s read, 2.0 seconds
                                           Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.5 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x56161a963350#2 capacity: 304.00 MB usage: 26.83 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000232 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1610,26.00 MB,8.55389%) FilterBlock(39,311.17 KB,0.0999601%) IndexBlock(39,534.27 KB,0.171626%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Oct 10 10:25:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:25:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:20.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:25:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:20 compute-2 nova_compute[235775]: 2025-10-10 10:25:20.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:21 compute-2 ceph-mon[74913]: pgmap v1183: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:25:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:21.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:25:21 compute-2 sudo[256706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:25:21 compute-2 sudo[256706]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:25:21 compute-2 sudo[256706]: pam_unix(sudo:session): session closed for user root
Oct 10 10:25:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:25:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:22.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:25:23 compute-2 nova_compute[235775]: 2025-10-10 10:25:23.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:23 compute-2 ceph-mon[74913]: pgmap v1184: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:23.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:24.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:25 compute-2 ceph-mon[74913]: pgmap v1185: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:25.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:25 compute-2 nova_compute[235775]: 2025-10-10 10:25:25.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:26 compute-2 ceph-mon[74913]: pgmap v1186: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:26 compute-2 podman[256736]: 2025-10-10 10:25:26.78075978 +0000 UTC m=+0.053796380 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true)
Oct 10 10:25:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:26.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:27.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:28 compute-2 nova_compute[235775]: 2025-10-10 10:25:28.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:28.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:29 compute-2 ceph-mon[74913]: pgmap v1187: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:29.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:25:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:30.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:25:30 compute-2 nova_compute[235775]: 2025-10-10 10:25:30.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:31 compute-2 ceph-mon[74913]: pgmap v1188: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:31.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:25:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:25:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:32.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:25:33 compute-2 nova_compute[235775]: 2025-10-10 10:25:33.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:33 compute-2 ceph-mon[74913]: pgmap v1189: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:33.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:25:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:34.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:25:35 compute-2 ceph-mon[74913]: pgmap v1190: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Oct 10 10:25:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:25:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:35.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:25:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:36 compute-2 nova_compute[235775]: 2025-10-10 10:25:36.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:36.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:37 compute-2 ceph-mon[74913]: pgmap v1191: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:25:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:37.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:25:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:38 compute-2 nova_compute[235775]: 2025-10-10 10:25:38.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:38.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:39 compute-2 ceph-mon[74913]: pgmap v1192: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:39.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:40.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:41 compute-2 nova_compute[235775]: 2025-10-10 10:25:41.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:41 compute-2 ceph-mon[74913]: pgmap v1193: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:25:41.482 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:25:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:25:41.483 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:25:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:25:41.483 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:25:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:25:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:41.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:25:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:41 compute-2 sudo[256772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:25:41 compute-2 sudo[256772]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:25:41 compute-2 sudo[256772]: pam_unix(sudo:session): session closed for user root
Oct 10 10:25:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:42.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:43 compute-2 nova_compute[235775]: 2025-10-10 10:25:43.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:43 compute-2 ceph-mon[74913]: pgmap v1194: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:25:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:43.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:44 compute-2 podman[256802]: 2025-10-10 10:25:44.801635056 +0000 UTC m=+0.065888967 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_managed=true, managed_by=edpm_ansible)
Oct 10 10:25:44 compute-2 podman[256800]: 2025-10-10 10:25:44.809535028 +0000 UTC m=+0.074770309 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 10 10:25:44 compute-2 podman[256801]: 2025-10-10 10:25:44.827985737 +0000 UTC m=+0.096096310 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 10 10:25:44 compute-2 sudo[256865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:25:44 compute-2 sudo[256865]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:25:44 compute-2 sudo[256865]: pam_unix(sudo:session): session closed for user root
Oct 10 10:25:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:44.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:44 compute-2 sudo[256891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Oct 10 10:25:44 compute-2 sudo[256891]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:25:45 compute-2 sudo[256891]: pam_unix(sudo:session): session closed for user root
Oct 10 10:25:45 compute-2 ceph-mon[74913]: pgmap v1195: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:45 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:25:45 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:25:45 compute-2 sudo[256937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:25:45 compute-2 sudo[256937]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:25:45 compute-2 sudo[256937]: pam_unix(sudo:session): session closed for user root
Oct 10 10:25:45 compute-2 sudo[256962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:25:45 compute-2 sudo[256962]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:25:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:45.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:45 compute-2 sudo[256962]: pam_unix(sudo:session): session closed for user root
Oct 10 10:25:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:46 compute-2 nova_compute[235775]: 2025-10-10 10:25:46.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:25:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:25:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:25:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:25:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:25:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:25:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:25:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:46.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:47 compute-2 ceph-mon[74913]: pgmap v1196: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:25:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:25:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:47.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:48 compute-2 nova_compute[235775]: 2025-10-10 10:25:48.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:48.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:49 compute-2 ceph-mon[74913]: pgmap v1197: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:49.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:25:49 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Cumulative writes: 10K writes, 42K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
                                           Cumulative WAL: 10K writes, 3043 syncs, 3.52 writes per sync, written: 0.04 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2448 writes, 8982 keys, 2448 commit groups, 1.0 writes per commit group, ingest: 9.37 MB, 0.02 MB/s
                                           Interval WAL: 2448 writes, 1024 syncs, 2.39 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct 10 10:25:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:50 compute-2 sudo[257023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:25:50 compute-2 sudo[257023]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:25:50 compute-2 sudo[257023]: pam_unix(sudo:session): session closed for user root
Oct 10 10:25:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:50.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:51 compute-2 nova_compute[235775]: 2025-10-10 10:25:51.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:51 compute-2 ceph-mon[74913]: pgmap v1198: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:25:51 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:25:51 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:25:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:51.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:52.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:53 compute-2 nova_compute[235775]: 2025-10-10 10:25:53.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:53 compute-2 ceph-mon[74913]: pgmap v1199: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:25:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:53.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:54.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:25:55 compute-2 ceph-mon[74913]: pgmap v1200: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:55.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:56 compute-2 nova_compute[235775]: 2025-10-10 10:25:56.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:56.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:57 compute-2 ceph-mon[74913]: pgmap v1201: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:25:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:57.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:57 compute-2 podman[257056]: 2025-10-10 10:25:57.778727486 +0000 UTC m=+0.058829060 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:25:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:58 compute-2 nova_compute[235775]: 2025-10-10 10:25:58.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:25:58 compute-2 nova_compute[235775]: 2025-10-10 10:25:58.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:25:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:58.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:25:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:25:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:25:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:25:59 compute-2 ceph-mon[74913]: pgmap v1202: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:25:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:25:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:25:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:59.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:25:59 compute-2 nova_compute[235775]: 2025-10-10 10:25:59.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:25:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:25:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:00 compute-2 ceph-mon[74913]: pgmap v1203: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:00.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:01 compute-2 nova_compute[235775]: 2025-10-10 10:26:01.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:26:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:01.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:01 compute-2 nova_compute[235775]: 2025-10-10 10:26:01.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:02 compute-2 sudo[257079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:26:02 compute-2 sudo[257079]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:26:02 compute-2 sudo[257079]: pam_unix(sudo:session): session closed for user root
Oct 10 10:26:02 compute-2 ceph-mon[74913]: pgmap v1204: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.813 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.813 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.837 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.837 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.838 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.866 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.866 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.866 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.866 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:26:02 compute-2 nova_compute[235775]: 2025-10-10 10:26:02.867 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:26:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:02 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:02.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:03 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:26:03 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/212724700' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.284 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.428 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.429 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4837MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.429 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.430 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.521 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.522 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:26:03 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/212724700' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:26:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:03.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.691 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing inventories for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.796 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating ProviderTree inventory for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.797 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.809 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing aggregate associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.835 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing trait associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, traits: HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_SVM,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 10 10:26:03 compute-2 nova_compute[235775]: 2025-10-10 10:26:03.851 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:26:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:04 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:26:04 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/470207897' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:26:04 compute-2 nova_compute[235775]: 2025-10-10 10:26:04.314 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:26:04 compute-2 nova_compute[235775]: 2025-10-10 10:26:04.321 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:26:04 compute-2 nova_compute[235775]: 2025-10-10 10:26:04.345 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:26:04 compute-2 nova_compute[235775]: 2025-10-10 10:26:04.347 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:26:04 compute-2 nova_compute[235775]: 2025-10-10 10:26:04.347 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.918s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:26:04 compute-2 nova_compute[235775]: 2025-10-10 10:26:04.348 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:04 compute-2 ceph-mon[74913]: pgmap v1205: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:26:04 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/470207897' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:26:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:04.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:05 compute-2 nova_compute[235775]: 2025-10-10 10:26:05.342 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:05.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:05 compute-2 nova_compute[235775]: 2025-10-10 10:26:05.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:05 compute-2 nova_compute[235775]: 2025-10-10 10:26:05.829 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:05 compute-2 nova_compute[235775]: 2025-10-10 10:26:05.830 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:26:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:06 compute-2 nova_compute[235775]: 2025-10-10 10:26:06.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:06 compute-2 ceph-mon[74913]: pgmap v1206: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:06.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:07.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3647399066' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:26:08 compute-2 nova_compute[235775]: 2025-10-10 10:26:08.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:08.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:08 compute-2 ceph-mon[74913]: pgmap v1207: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:26:08 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1949448778' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:26:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:09.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:10.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:11 compute-2 ceph-mon[74913]: pgmap v1208: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:11 compute-2 nova_compute[235775]: 2025-10-10 10:26:11.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:11.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:26:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:12.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:26:13 compute-2 ceph-mon[74913]: pgmap v1209: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:13 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/943224587' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:26:13 compute-2 nova_compute[235775]: 2025-10-10 10:26:13.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:13.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:13 compute-2 nova_compute[235775]: 2025-10-10 10:26:13.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:13 compute-2 nova_compute[235775]: 2025-10-10 10:26:13.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 10 10:26:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:14 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2125983211' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:26:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:14.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:15 compute-2 ceph-mon[74913]: pgmap v1210: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:26:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:15.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:15 compute-2 podman[257166]: 2025-10-10 10:26:15.789345675 +0000 UTC m=+0.058298254 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 10:26:15 compute-2 podman[257164]: 2025-10-10 10:26:15.79796048 +0000 UTC m=+0.071959510 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 10 10:26:15 compute-2 podman[257165]: 2025-10-10 10:26:15.823510676 +0000 UTC m=+0.095378668 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 10 10:26:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:16 compute-2 nova_compute[235775]: 2025-10-10 10:26:16.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:16 compute-2 nova_compute[235775]: 2025-10-10 10:26:16.838 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:16 compute-2 nova_compute[235775]: 2025-10-10 10:26:16.839 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 10 10:26:16 compute-2 nova_compute[235775]: 2025-10-10 10:26:16.878 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 10 10:26:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:16.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:17 compute-2 ceph-mon[74913]: pgmap v1211: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:26:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:26:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:17.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:26:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:18 compute-2 nova_compute[235775]: 2025-10-10 10:26:18.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:26:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:18.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:26:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:19 compute-2 ceph-mon[74913]: pgmap v1212: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:26:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:26:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:19.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:26:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:20.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:21 compute-2 nova_compute[235775]: 2025-10-10 10:26:21.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:21 compute-2 ceph-mon[74913]: pgmap v1213: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:26:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:21.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:26:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:22 compute-2 sudo[257233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:26:22 compute-2 sudo[257233]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:26:22 compute-2 sudo[257233]: pam_unix(sudo:session): session closed for user root
Oct 10 10:26:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:22.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:23 compute-2 ceph-mon[74913]: pgmap v1214: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:23 compute-2 nova_compute[235775]: 2025-10-10 10:26:23.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:23.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.593072) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984593101, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 1140, "num_deletes": 251, "total_data_size": 2603785, "memory_usage": 2650792, "flush_reason": "Manual Compaction"}
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984603375, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 1080048, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36919, "largest_seqno": 38053, "table_properties": {"data_size": 1076005, "index_size": 1631, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10781, "raw_average_key_size": 20, "raw_value_size": 1067267, "raw_average_value_size": 2068, "num_data_blocks": 70, "num_entries": 516, "num_filter_entries": 516, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091894, "oldest_key_time": 1760091894, "file_creation_time": 1760091984, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 10332 microseconds, and 3214 cpu microseconds.
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.603405) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 1080048 bytes OK
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.603419) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.605033) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.605043) EVENT_LOG_v1 {"time_micros": 1760091984605039, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.605057) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 2598290, prev total WAL file size 2598290, number of live WAL files 2.
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.605690) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303033' seq:72057594037927935, type:22 .. '6D6772737461740031323535' seq:0, type:0; will stop at (end)
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(1054KB)], [69(14MB)]
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984605715, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 15938847, "oldest_snapshot_seqno": -1}
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6534 keys, 12459862 bytes, temperature: kUnknown
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984649814, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 12459862, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12419315, "index_size": 23091, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 171738, "raw_average_key_size": 26, "raw_value_size": 12304663, "raw_average_value_size": 1883, "num_data_blocks": 907, "num_entries": 6534, "num_filter_entries": 6534, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091984, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.650154) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 12459862 bytes
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.651399) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 360.4 rd, 281.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 14.2 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(26.3) write-amplify(11.5) OK, records in: 7013, records dropped: 479 output_compression: NoCompression
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.651418) EVENT_LOG_v1 {"time_micros": 1760091984651409, "job": 42, "event": "compaction_finished", "compaction_time_micros": 44222, "compaction_time_cpu_micros": 24421, "output_level": 6, "num_output_files": 1, "total_output_size": 12459862, "num_input_records": 7013, "num_output_records": 6534, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984651711, "job": 42, "event": "table_file_deletion", "file_number": 71}
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984654365, "job": 42, "event": "table_file_deletion", "file_number": 69}
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.605631) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.654460) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.654467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.654470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.654473) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:26:24 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.654476) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:26:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:24.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:25 compute-2 ceph-mon[74913]: pgmap v1215: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:26:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:25.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:26 compute-2 nova_compute[235775]: 2025-10-10 10:26:26.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:26.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:27 compute-2 ceph-mon[74913]: pgmap v1216: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1280316982' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:26:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1280316982' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:26:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:26:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:27.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:26:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:28 compute-2 nova_compute[235775]: 2025-10-10 10:26:28.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:28 compute-2 podman[257265]: 2025-10-10 10:26:28.781008472 +0000 UTC m=+0.054524473 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 10:26:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:28.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:29 compute-2 ceph-mon[74913]: pgmap v1217: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:26:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:29.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:30.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:31 compute-2 nova_compute[235775]: 2025-10-10 10:26:31.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:31 compute-2 ceph-mon[74913]: pgmap v1218: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:31.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:26:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:32.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:33 compute-2 nova_compute[235775]: 2025-10-10 10:26:33.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:33 compute-2 ceph-mon[74913]: pgmap v1219: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:33.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:34.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:35 compute-2 ceph-mon[74913]: pgmap v1220: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:26:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:35.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:36 compute-2 nova_compute[235775]: 2025-10-10 10:26:36.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:36.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:37 compute-2 ceph-mon[74913]: pgmap v1221: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:37.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:38 compute-2 nova_compute[235775]: 2025-10-10 10:26:38.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:38 compute-2 nova_compute[235775]: 2025-10-10 10:26:38.944 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:38.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:39 compute-2 ceph-mon[74913]: pgmap v1222: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:26:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:39.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:40.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:41 compute-2 nova_compute[235775]: 2025-10-10 10:26:41.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:41 compute-2 ceph-mon[74913]: pgmap v1223: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:26:41.483 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:26:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:26:41.483 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:26:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:26:41.484 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:26:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:41.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:42 compute-2 sudo[257298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:26:42 compute-2 sudo[257298]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:26:42 compute-2 sudo[257298]: pam_unix(sudo:session): session closed for user root
Oct 10 10:26:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:42.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:43 compute-2 nova_compute[235775]: 2025-10-10 10:26:43.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:43 compute-2 ceph-mon[74913]: pgmap v1224: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:43.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:44.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:45 compute-2 ceph-mon[74913]: pgmap v1225: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:26:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:45.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:46 compute-2 nova_compute[235775]: 2025-10-10 10:26:46.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:46 compute-2 podman[257328]: 2025-10-10 10:26:46.778729035 +0000 UTC m=+0.058407325 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 10:26:46 compute-2 podman[257330]: 2025-10-10 10:26:46.79080589 +0000 UTC m=+0.062040521 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 10:26:46 compute-2 podman[257329]: 2025-10-10 10:26:46.805123347 +0000 UTC m=+0.081751650 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 10 10:26:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:46.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:47 compute-2 ceph-mon[74913]: pgmap v1226: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:26:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:47.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:48 compute-2 nova_compute[235775]: 2025-10-10 10:26:48.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:49.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:49 compute-2 ceph-mon[74913]: pgmap v1227: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:26:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:49.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:50 compute-2 sudo[257397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:26:50 compute-2 sudo[257397]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:26:50 compute-2 sudo[257397]: pam_unix(sudo:session): session closed for user root
Oct 10 10:26:50 compute-2 sudo[257422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:26:50 compute-2 sudo[257422]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:26:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:51.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:51 compute-2 nova_compute[235775]: 2025-10-10 10:26:51.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:51 compute-2 sudo[257422]: pam_unix(sudo:session): session closed for user root
Oct 10 10:26:51 compute-2 ceph-mon[74913]: pgmap v1228: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:26:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:51.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:26:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:26:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:26:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:26:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:26:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:26:52 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:26:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:53.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:53 compute-2 nova_compute[235775]: 2025-10-10 10:26:53.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:53 compute-2 ceph-mon[74913]: pgmap v1229: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:26:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:53.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:55.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:55 compute-2 ceph-mon[74913]: pgmap v1230: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 1 op/s
Oct 10 10:26:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:26:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:55.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:56 compute-2 nova_compute[235775]: 2025-10-10 10:26:56.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:56 compute-2 sudo[257484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:26:56 compute-2 sudo[257484]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:26:56 compute-2 sudo[257484]: pam_unix(sudo:session): session closed for user root
Oct 10 10:26:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:57.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:57 compute-2 ceph-mon[74913]: pgmap v1231: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:26:57 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:26:57 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:26:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:57.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:58 compute-2 nova_compute[235775]: 2025-10-10 10:26:58.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:26:58 compute-2 nova_compute[235775]: 2025-10-10 10:26:58.832 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:26:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:26:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:26:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:26:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:26:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:26:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:59.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:26:59 compute-2 ceph-mon[74913]: pgmap v1232: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 1 op/s
Oct 10 10:26:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:26:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:26:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:59.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:26:59 compute-2 podman[257513]: 2025-10-10 10:26:59.823685777 +0000 UTC m=+0.088297450 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 10:26:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:26:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:00 compute-2 nova_compute[235775]: 2025-10-10 10:27:00.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:27:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:01.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:01 compute-2 ceph-mon[74913]: pgmap v1233: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:27:01 compute-2 nova_compute[235775]: 2025-10-10 10:27:01.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:01.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:01 compute-2 nova_compute[235775]: 2025-10-10 10:27:01.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:27:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:27:02 compute-2 sudo[257535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:27:02 compute-2 sudo[257535]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:27:02 compute-2 sudo[257535]: pam_unix(sudo:session): session closed for user root
Oct 10 10:27:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:03.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:03 compute-2 ceph-mon[74913]: pgmap v1234: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:27:03 compute-2 nova_compute[235775]: 2025-10-10 10:27:03.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:03.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:03 compute-2 nova_compute[235775]: 2025-10-10 10:27:03.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:27:03 compute-2 nova_compute[235775]: 2025-10-10 10:27:03.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:27:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:04 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Oct 10 10:27:04 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Oct 10 10:27:04 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Oct 10 10:27:04 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Oct 10 10:27:04 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Oct 10 10:27:04 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Oct 10 10:27:04 compute-2 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Oct 10 10:27:04 compute-2 nova_compute[235775]: 2025-10-10 10:27:04.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:27:04 compute-2 nova_compute[235775]: 2025-10-10 10:27:04.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:27:04 compute-2 nova_compute[235775]: 2025-10-10 10:27:04.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:27:04 compute-2 nova_compute[235775]: 2025-10-10 10:27:04.838 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:27:04 compute-2 nova_compute[235775]: 2025-10-10 10:27:04.839 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:27:04 compute-2 nova_compute[235775]: 2025-10-10 10:27:04.839 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:27:04 compute-2 nova_compute[235775]: 2025-10-10 10:27:04.878 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:27:04 compute-2 nova_compute[235775]: 2025-10-10 10:27:04.879 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:27:04 compute-2 nova_compute[235775]: 2025-10-10 10:27:04.879 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:27:04 compute-2 nova_compute[235775]: 2025-10-10 10:27:04.880 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:27:04 compute-2 nova_compute[235775]: 2025-10-10 10:27:04.880 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:27:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:05.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:05 compute-2 ceph-mon[74913]: pgmap v1235: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:27:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:27:05 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/821806748' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:27:05 compute-2 nova_compute[235775]: 2025-10-10 10:27:05.332 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:27:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:05 compute-2 nova_compute[235775]: 2025-10-10 10:27:05.481 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:27:05 compute-2 nova_compute[235775]: 2025-10-10 10:27:05.482 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4844MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:27:05 compute-2 nova_compute[235775]: 2025-10-10 10:27:05.482 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:27:05 compute-2 nova_compute[235775]: 2025-10-10 10:27:05.482 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:27:05 compute-2 nova_compute[235775]: 2025-10-10 10:27:05.551 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:27:05 compute-2 nova_compute[235775]: 2025-10-10 10:27:05.551 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:27:05 compute-2 nova_compute[235775]: 2025-10-10 10:27:05.586 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:27:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:05.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:06 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:27:06 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2564603434' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:27:06 compute-2 nova_compute[235775]: 2025-10-10 10:27:06.085 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:27:06 compute-2 nova_compute[235775]: 2025-10-10 10:27:06.091 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:27:06 compute-2 nova_compute[235775]: 2025-10-10 10:27:06.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:06 compute-2 nova_compute[235775]: 2025-10-10 10:27:06.115 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:27:06 compute-2 nova_compute[235775]: 2025-10-10 10:27:06.117 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:27:06 compute-2 nova_compute[235775]: 2025-10-10 10:27:06.117 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:27:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/821806748' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:27:06 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2564603434' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:27:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:07.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:07 compute-2 ceph-mon[74913]: pgmap v1236: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:07.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:08 compute-2 nova_compute[235775]: 2025-10-10 10:27:08.093 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:27:08 compute-2 nova_compute[235775]: 2025-10-10 10:27:08.094 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:27:08 compute-2 nova_compute[235775]: 2025-10-10 10:27:08.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:09.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:09 compute-2 ceph-mon[74913]: pgmap v1237: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 105 KiB/s rd, 0 B/s wr, 174 op/s
Oct 10 10:27:09 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3249927567' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:27:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:27:09 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3915087126' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:27:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:09.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:10 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3915087126' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:27:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:11.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:11 compute-2 nova_compute[235775]: 2025-10-10 10:27:11.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:11 compute-2 ceph-mon[74913]: pgmap v1238: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 105 KiB/s rd, 0 B/s wr, 173 op/s
Oct 10 10:27:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:27:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:11.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:27:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:13.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:13 compute-2 nova_compute[235775]: 2025-10-10 10:27:13.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:13 compute-2 ceph-mon[74913]: pgmap v1239: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 105 KiB/s rd, 0 B/s wr, 173 op/s
Oct 10 10:27:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:13.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:15.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:15 compute-2 ceph-mon[74913]: pgmap v1240: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 105 KiB/s rd, 0 B/s wr, 174 op/s
Oct 10 10:27:15 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/4181353496' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:27:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:15.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:16 compute-2 nova_compute[235775]: 2025-10-10 10:27:16.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:16 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/106958647' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:27:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:17.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:17 compute-2 ceph-mon[74913]: pgmap v1241: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 105 KiB/s rd, 0 B/s wr, 173 op/s
Oct 10 10:27:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:27:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:17.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:17 compute-2 podman[257620]: 2025-10-10 10:27:17.815855029 +0000 UTC m=+0.082064711 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3)
Oct 10 10:27:17 compute-2 podman[257624]: 2025-10-10 10:27:17.848070588 +0000 UTC m=+0.093875879 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 10:27:17 compute-2 podman[257621]: 2025-10-10 10:27:17.850983911 +0000 UTC m=+0.112478773 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 10 10:27:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:18 compute-2 nova_compute[235775]: 2025-10-10 10:27:18.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:19.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:19 compute-2 ceph-mon[74913]: pgmap v1242: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 105 KiB/s rd, 0 B/s wr, 174 op/s
Oct 10 10:27:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:19.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:21.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:21 compute-2 nova_compute[235775]: 2025-10-10 10:27:21.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:21 compute-2 ceph-mon[74913]: pgmap v1243: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:21.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:22 compute-2 sudo[257686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:27:22 compute-2 sudo[257686]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:27:22 compute-2 sudo[257686]: pam_unix(sudo:session): session closed for user root
Oct 10 10:27:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:23.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:23 compute-2 nova_compute[235775]: 2025-10-10 10:27:23.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:23 compute-2 ceph-mon[74913]: pgmap v1244: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:23.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:25.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:25 compute-2 ceph-mon[74913]: pgmap v1245: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:27:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:25.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:26 compute-2 nova_compute[235775]: 2025-10-10 10:27:26.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:27:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4007566910' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:27:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:27:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4007566910' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:27:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:27.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:27 compute-2 ceph-mon[74913]: pgmap v1246: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/4007566910' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:27:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/4007566910' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:27:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:27.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:28 compute-2 nova_compute[235775]: 2025-10-10 10:27:28.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:29.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:29 compute-2 ceph-mon[74913]: pgmap v1247: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:27:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:29.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:30 compute-2 nova_compute[235775]: 2025-10-10 10:27:30.466 2 DEBUG oslo_concurrency.processutils [None req-5428eec2-0e0c-4df7-adf7-b6b22d8050c9 e1aed125091e48e09d5990f110c14c39 ec962e275689437d80680ff3ea69c852 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:27:30 compute-2 nova_compute[235775]: 2025-10-10 10:27:30.502 2 DEBUG oslo_concurrency.processutils [None req-5428eec2-0e0c-4df7-adf7-b6b22d8050c9 e1aed125091e48e09d5990f110c14c39 ec962e275689437d80680ff3ea69c852 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:27:30 compute-2 podman[257721]: 2025-10-10 10:27:30.775619942 +0000 UTC m=+0.053569842 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 10 10:27:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:31.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:31 compute-2 nova_compute[235775]: 2025-10-10 10:27:31.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:31 compute-2 ceph-mon[74913]: pgmap v1248: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:31.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:27:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:33.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:33 compute-2 nova_compute[235775]: 2025-10-10 10:27:33.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:33 compute-2 ceph-mon[74913]: pgmap v1249: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:33.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:27:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:35.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:27:35 compute-2 ceph-mon[74913]: pgmap v1250: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:27:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:35.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:36 compute-2 nova_compute[235775]: 2025-10-10 10:27:36.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:36 compute-2 nova_compute[235775]: 2025-10-10 10:27:36.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:36 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:27:36.324 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 10:27:36 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:27:36.327 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 10:27:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:37.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:37 compute-2 ceph-mon[74913]: pgmap v1251: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:37.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:38 compute-2 nova_compute[235775]: 2025-10-10 10:27:38.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:39.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:39 compute-2 ceph-mon[74913]: pgmap v1252: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:27:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:39.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:41.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:41 compute-2 nova_compute[235775]: 2025-10-10 10:27:41.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:27:41.329 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 10 10:27:41 compute-2 ceph-mon[74913]: pgmap v1253: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:27:41.484 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:27:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:27:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:27:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:27:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:27:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:41.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:42 compute-2 sudo[257753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:27:42 compute-2 sudo[257753]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:27:42 compute-2 sudo[257753]: pam_unix(sudo:session): session closed for user root
Oct 10 10:27:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:27:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:27:43 compute-2 nova_compute[235775]: 2025-10-10 10:27:43.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:43 compute-2 ceph-mon[74913]: pgmap v1254: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:43.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:45.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:45 compute-2 ceph-mon[74913]: pgmap v1255: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:27:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:45.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:46 compute-2 nova_compute[235775]: 2025-10-10 10:27:46.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:27:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:47.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:47 compute-2 ceph-mon[74913]: pgmap v1256: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:47.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:48 compute-2 nova_compute[235775]: 2025-10-10 10:27:48.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:48 compute-2 ceph-mon[74913]: pgmap v1257: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:27:48 compute-2 podman[257785]: 2025-10-10 10:27:48.82899107 +0000 UTC m=+0.095317765 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:27:48 compute-2 podman[257786]: 2025-10-10 10:27:48.854728141 +0000 UTC m=+0.116329385 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 10 10:27:48 compute-2 podman[257787]: 2025-10-10 10:27:48.861007221 +0000 UTC m=+0.116094027 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 10:27:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:49.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:49.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:50 compute-2 ceph-mon[74913]: pgmap v1258: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:51.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:51 compute-2 nova_compute[235775]: 2025-10-10 10:27:51.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:51.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:52 compute-2 ceph-mon[74913]: pgmap v1259: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:53.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:53 compute-2 nova_compute[235775]: 2025-10-10 10:27:53.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:53.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:54 compute-2 ceph-mon[74913]: pgmap v1260: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:27:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:55.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:27:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:55.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:56 compute-2 nova_compute[235775]: 2025-10-10 10:27:56.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:56 compute-2 sudo[257856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:27:56 compute-2 sudo[257856]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:27:56 compute-2 sudo[257856]: pam_unix(sudo:session): session closed for user root
Oct 10 10:27:56 compute-2 sudo[257881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:27:56 compute-2 sudo[257881]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:27:56 compute-2 ceph-mon[74913]: pgmap v1261: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:27:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:27:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:57.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:27:57 compute-2 sudo[257881]: pam_unix(sudo:session): session closed for user root
Oct 10 10:27:57 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:27:57 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:27:57 compute-2 ceph-mon[74913]: pgmap v1262: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:27:57 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:27:57 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:27:57 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:27:57 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:27:57 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:27:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:57.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:58 compute-2 nova_compute[235775]: 2025-10-10 10:27:58.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:27:58 compute-2 nova_compute[235775]: 2025-10-10 10:27:58.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:27:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:27:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:27:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:27:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:27:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:59.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:27:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:27:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:59.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:27:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:27:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:00 compute-2 ceph-mon[74913]: pgmap v1263: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:28:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:00 compute-2 nova_compute[235775]: 2025-10-10 10:28:00.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:28:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:01.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:01 compute-2 nova_compute[235775]: 2025-10-10 10:28:01.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:01 compute-2 podman[257944]: 2025-10-10 10:28:01.811800128 +0000 UTC m=+0.078622902 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 10 10:28:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:01.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:02 compute-2 sudo[257964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:28:02 compute-2 sudo[257964]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:28:02 compute-2 sudo[257964]: pam_unix(sudo:session): session closed for user root
Oct 10 10:28:02 compute-2 ceph-mon[74913]: pgmap v1264: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:28:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:28:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:28:02 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:28:02 compute-2 sudo[257990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:28:02 compute-2 sudo[257990]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:28:02 compute-2 sudo[257990]: pam_unix(sudo:session): session closed for user root
Oct 10 10:28:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:28:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:03.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:28:03 compute-2 nova_compute[235775]: 2025-10-10 10:28:03.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:03 compute-2 nova_compute[235775]: 2025-10-10 10:28:03.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:28:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:03.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:04 compute-2 ceph-mon[74913]: pgmap v1265: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:28:04 compute-2 nova_compute[235775]: 2025-10-10 10:28:04.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:28:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:05.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:28:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:05.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.847 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.847 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.848 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.877 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.877 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.877 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.878 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:28:05 compute-2 nova_compute[235775]: 2025-10-10 10:28:05.878 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:28:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:06 compute-2 nova_compute[235775]: 2025-10-10 10:28:06.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:06 compute-2 ceph-mon[74913]: pgmap v1266: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:28:06 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:28:06 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2704639440' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:28:06 compute-2 nova_compute[235775]: 2025-10-10 10:28:06.353 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:28:06 compute-2 nova_compute[235775]: 2025-10-10 10:28:06.487 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:28:06 compute-2 nova_compute[235775]: 2025-10-10 10:28:06.488 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4846MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:28:06 compute-2 nova_compute[235775]: 2025-10-10 10:28:06.489 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:28:06 compute-2 nova_compute[235775]: 2025-10-10 10:28:06.489 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:28:06 compute-2 nova_compute[235775]: 2025-10-10 10:28:06.538 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:28:06 compute-2 nova_compute[235775]: 2025-10-10 10:28:06.538 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:28:06 compute-2 nova_compute[235775]: 2025-10-10 10:28:06.552 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:28:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:07 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:28:07 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1632890228' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:28:07 compute-2 nova_compute[235775]: 2025-10-10 10:28:07.016 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:28:07 compute-2 nova_compute[235775]: 2025-10-10 10:28:07.021 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:28:07 compute-2 nova_compute[235775]: 2025-10-10 10:28:07.040 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:28:07 compute-2 nova_compute[235775]: 2025-10-10 10:28:07.042 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:28:07 compute-2 nova_compute[235775]: 2025-10-10 10:28:07.042 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:28:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:07.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2704639440' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:28:07 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1632890228' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:28:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:07.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:08 compute-2 nova_compute[235775]: 2025-10-10 10:28:08.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:08 compute-2 ceph-mon[74913]: pgmap v1267: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:28:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:09 compute-2 nova_compute[235775]: 2025-10-10 10:28:09.009 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:28:09 compute-2 nova_compute[235775]: 2025-10-10 10:28:09.024 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:28:09 compute-2 nova_compute[235775]: 2025-10-10 10:28:09.025 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:28:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:09.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:09.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:09 compute-2 sshd[167664]: Timeout before authentication for connection from 115.190.21.38 to 38.102.83.22, pid = 257154
Oct 10 10:28:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:10 compute-2 ceph-mon[74913]: pgmap v1268: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:28:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:11.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:11 compute-2 nova_compute[235775]: 2025-10-10 10:28:11.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:11 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/577030174' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:28:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:11.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:12 compute-2 ceph-mon[74913]: pgmap v1269: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:12 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2943645577' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:28:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:13.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:13 compute-2 nova_compute[235775]: 2025-10-10 10:28:13.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:13.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:14 compute-2 ceph-mon[74913]: pgmap v1270: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:28:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:28:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:15.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:28:15 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3528792367' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:28:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:15.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:16 compute-2 nova_compute[235775]: 2025-10-10 10:28:16.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:16 compute-2 ceph-mon[74913]: pgmap v1271: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:16 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1813754537' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:28:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:28:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:17.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:17.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:18 compute-2 nova_compute[235775]: 2025-10-10 10:28:18.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:18 compute-2 ceph-mon[74913]: pgmap v1272: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:19.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:19 compute-2 podman[258076]: 2025-10-10 10:28:19.783704805 +0000 UTC m=+0.062533168 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 10:28:19 compute-2 podman[258078]: 2025-10-10 10:28:19.792562987 +0000 UTC m=+0.066374380 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001)
Oct 10 10:28:19 compute-2 podman[258077]: 2025-10-10 10:28:19.8108098 +0000 UTC m=+0.088255050 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct 10 10:28:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:19.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:20 compute-2 ceph-mon[74913]: pgmap v1273: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:28:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:21.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:21 compute-2 nova_compute[235775]: 2025-10-10 10:28:21.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:21.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:22 compute-2 ceph-mon[74913]: pgmap v1274: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:22 compute-2 sudo[258143]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:28:22 compute-2 sudo[258143]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:28:22 compute-2 sudo[258143]: pam_unix(sudo:session): session closed for user root
Oct 10 10:28:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:23.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:23 compute-2 nova_compute[235775]: 2025-10-10 10:28:23.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:23.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:24 compute-2 ceph-mon[74913]: pgmap v1275: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:28:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:25.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:25.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:26 compute-2 nova_compute[235775]: 2025-10-10 10:28:26.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:26 compute-2 ceph-mon[74913]: pgmap v1276: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3744893259' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:28:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/3744893259' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:28:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:27.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.516627) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107516672, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1454, "num_deletes": 251, "total_data_size": 3529015, "memory_usage": 3577552, "flush_reason": "Manual Compaction"}
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107532642, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 2302543, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38058, "largest_seqno": 39507, "table_properties": {"data_size": 2296439, "index_size": 3367, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13187, "raw_average_key_size": 20, "raw_value_size": 2284080, "raw_average_value_size": 3465, "num_data_blocks": 147, "num_entries": 659, "num_filter_entries": 659, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091985, "oldest_key_time": 1760091985, "file_creation_time": 1760092107, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 16074 microseconds, and 9680 cpu microseconds.
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.532699) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 2302543 bytes OK
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.532723) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.534535) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.534557) EVENT_LOG_v1 {"time_micros": 1760092107534550, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.534580) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 3522258, prev total WAL file size 3522258, number of live WAL files 2.
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.536374) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(2248KB)], [72(11MB)]
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107536424, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 14762405, "oldest_snapshot_seqno": -1}
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6677 keys, 12616330 bytes, temperature: kUnknown
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107608701, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 12616330, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12574708, "index_size": 23846, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16709, "raw_key_size": 175413, "raw_average_key_size": 26, "raw_value_size": 12457292, "raw_average_value_size": 1865, "num_data_blocks": 936, "num_entries": 6677, "num_filter_entries": 6677, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760092107, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.609403) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 12616330 bytes
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.610854) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.8 rd, 174.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 11.9 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(11.9) write-amplify(5.5) OK, records in: 7193, records dropped: 516 output_compression: NoCompression
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.610885) EVENT_LOG_v1 {"time_micros": 1760092107610871, "job": 44, "event": "compaction_finished", "compaction_time_micros": 72439, "compaction_time_cpu_micros": 49047, "output_level": 6, "num_output_files": 1, "total_output_size": 12616330, "num_input_records": 7193, "num_output_records": 6677, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107612163, "job": 44, "event": "table_file_deletion", "file_number": 74}
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107616156, "job": 44, "event": "table_file_deletion", "file_number": 72}
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.536291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.616285) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.616291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.616294) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.616297) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:28:27 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.616300) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:28:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:27.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:28 compute-2 nova_compute[235775]: 2025-10-10 10:28:28.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:28 compute-2 ceph-mon[74913]: pgmap v1277: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:29.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:29 compute-2 ceph-mon[74913]: pgmap v1278: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:28:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:29.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:31.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:31 compute-2 nova_compute[235775]: 2025-10-10 10:28:31.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:31.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:32 compute-2 ceph-mon[74913]: pgmap v1279: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:32 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:28:32 compute-2 podman[258178]: 2025-10-10 10:28:32.808301617 +0000 UTC m=+0.072936890 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 10 10:28:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:33.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:33 compute-2 nova_compute[235775]: 2025-10-10 10:28:33.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:33.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:34 compute-2 ceph-mon[74913]: pgmap v1280: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:28:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:35.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:35.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:36 compute-2 nova_compute[235775]: 2025-10-10 10:28:36.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:36 compute-2 ceph-mon[74913]: pgmap v1281: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:37.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:37.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:38 compute-2 nova_compute[235775]: 2025-10-10 10:28:38.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:38 compute-2 ceph-mon[74913]: pgmap v1282: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:39.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:39.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:40 compute-2 ceph-mon[74913]: pgmap v1283: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:28:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:41.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:41 compute-2 nova_compute[235775]: 2025-10-10 10:28:41.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:28:41.484 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:28:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:28:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:28:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:28:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:28:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:41.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:42 compute-2 ceph-mon[74913]: pgmap v1284: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:42 compute-2 sudo[258209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:28:42 compute-2 sudo[258209]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:28:42 compute-2 sudo[258209]: pam_unix(sudo:session): session closed for user root
Oct 10 10:28:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:43.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:43 compute-2 nova_compute[235775]: 2025-10-10 10:28:43.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:43.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:44 compute-2 ceph-mon[74913]: pgmap v1285: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:28:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:45.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:45.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:46 compute-2 nova_compute[235775]: 2025-10-10 10:28:46.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:46 compute-2 ceph-mon[74913]: pgmap v1286: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:28:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:47.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:47.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:48 compute-2 nova_compute[235775]: 2025-10-10 10:28:48.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:48 compute-2 ceph-mon[74913]: pgmap v1287: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:49.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:49.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:50 compute-2 ceph-mon[74913]: pgmap v1288: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:28:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:50 compute-2 podman[258243]: 2025-10-10 10:28:50.811726861 +0000 UTC m=+0.064960575 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 10 10:28:50 compute-2 podman[258241]: 2025-10-10 10:28:50.811865495 +0000 UTC m=+0.074136538 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 10 10:28:50 compute-2 podman[258242]: 2025-10-10 10:28:50.848529095 +0000 UTC m=+0.113649419 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct 10 10:28:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:51.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:51 compute-2 nova_compute[235775]: 2025-10-10 10:28:51.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:51.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:52 compute-2 ceph-mon[74913]: pgmap v1289: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:53.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:53 compute-2 nova_compute[235775]: 2025-10-10 10:28:53.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:53.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:54 compute-2 ceph-mon[74913]: pgmap v1290: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:28:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:55.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:28:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:55.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:56 compute-2 nova_compute[235775]: 2025-10-10 10:28:56.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:56 compute-2 ceph-mon[74913]: pgmap v1291: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:28:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:57.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:28:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:57.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:58 compute-2 nova_compute[235775]: 2025-10-10 10:28:58.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:28:58 compute-2 ceph-mon[74913]: pgmap v1292: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:28:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:28:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:28:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:28:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:28:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:59.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:59 compute-2 nova_compute[235775]: 2025-10-10 10:28:59.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:28:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:28:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:28:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:59.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:28:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:28:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:00 compute-2 ceph-mon[74913]: pgmap v1293: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:01.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:01 compute-2 nova_compute[235775]: 2025-10-10 10:29:01.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:29:01 compute-2 nova_compute[235775]: 2025-10-10 10:29:01.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:29:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:01.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:02 compute-2 sudo[258318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:29:02 compute-2 sudo[258318]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:29:02 compute-2 sudo[258318]: pam_unix(sudo:session): session closed for user root
Oct 10 10:29:02 compute-2 sudo[258343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:29:02 compute-2 sudo[258343]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:29:02 compute-2 ceph-mon[74913]: pgmap v1294: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:02 compute-2 sudo[258343]: pam_unix(sudo:session): session closed for user root
Oct 10 10:29:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:03.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:03 compute-2 sudo[258401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:29:03 compute-2 sudo[258401]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:29:03 compute-2 sudo[258401]: pam_unix(sudo:session): session closed for user root
Oct 10 10:29:03 compute-2 nova_compute[235775]: 2025-10-10 10:29:03.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:03 compute-2 podman[258425]: 2025-10-10 10:29:03.281690604 +0000 UTC m=+0.054956265 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 10 10:29:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:29:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:29:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:29:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:29:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:29:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:29:03 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:29:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:03.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:04 compute-2 ceph-mon[74913]: pgmap v1295: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:04 compute-2 nova_compute[235775]: 2025-10-10 10:29:04.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:29:04 compute-2 nova_compute[235775]: 2025-10-10 10:29:04.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:29:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:29:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:05.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:29:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:05 compute-2 ceph-mon[74913]: pgmap v1296: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:29:05 compute-2 nova_compute[235775]: 2025-10-10 10:29:05.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:29:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:05.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:06 compute-2 nova_compute[235775]: 2025-10-10 10:29:06.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:06 compute-2 nova_compute[235775]: 2025-10-10 10:29:06.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:29:06 compute-2 nova_compute[235775]: 2025-10-10 10:29:06.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:29:06 compute-2 nova_compute[235775]: 2025-10-10 10:29:06.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:29:06 compute-2 nova_compute[235775]: 2025-10-10 10:29:06.831 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:29:06 compute-2 nova_compute[235775]: 2025-10-10 10:29:06.831 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:29:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:07.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:07 compute-2 nova_compute[235775]: 2025-10-10 10:29:07.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:29:07 compute-2 nova_compute[235775]: 2025-10-10 10:29:07.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:29:07 compute-2 nova_compute[235775]: 2025-10-10 10:29:07.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:29:07 compute-2 nova_compute[235775]: 2025-10-10 10:29:07.842 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:29:07 compute-2 nova_compute[235775]: 2025-10-10 10:29:07.842 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:29:07 compute-2 nova_compute[235775]: 2025-10-10 10:29:07.842 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:29:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:07.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:08 compute-2 sudo[258471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:29:08 compute-2 sudo[258471]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:29:08 compute-2 sudo[258471]: pam_unix(sudo:session): session closed for user root
Oct 10 10:29:08 compute-2 ceph-mon[74913]: pgmap v1297: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:29:08 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:29:08 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:29:08 compute-2 nova_compute[235775]: 2025-10-10 10:29:08.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:08 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:29:08 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2349870666' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:29:08 compute-2 nova_compute[235775]: 2025-10-10 10:29:08.296 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:29:08 compute-2 nova_compute[235775]: 2025-10-10 10:29:08.441 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:29:08 compute-2 nova_compute[235775]: 2025-10-10 10:29:08.443 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4842MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:29:08 compute-2 nova_compute[235775]: 2025-10-10 10:29:08.443 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:29:08 compute-2 nova_compute[235775]: 2025-10-10 10:29:08.443 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:29:08 compute-2 nova_compute[235775]: 2025-10-10 10:29:08.675 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:29:08 compute-2 nova_compute[235775]: 2025-10-10 10:29:08.676 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:29:08 compute-2 nova_compute[235775]: 2025-10-10 10:29:08.717 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:29:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:29:09 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2057179231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:29:09 compute-2 nova_compute[235775]: 2025-10-10 10:29:09.144 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:29:09 compute-2 nova_compute[235775]: 2025-10-10 10:29:09.151 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:29:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:09.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:09 compute-2 nova_compute[235775]: 2025-10-10 10:29:09.168 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:29:09 compute-2 nova_compute[235775]: 2025-10-10 10:29:09.170 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:29:09 compute-2 nova_compute[235775]: 2025-10-10 10:29:09.170 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.727s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:29:09 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2349870666' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:29:09 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2057179231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:29:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:09.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:10 compute-2 ceph-mon[74913]: pgmap v1298: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:11.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:11 compute-2 nova_compute[235775]: 2025-10-10 10:29:11.171 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:29:11 compute-2 nova_compute[235775]: 2025-10-10 10:29:11.172 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:29:11 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1966094890' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:29:11 compute-2 nova_compute[235775]: 2025-10-10 10:29:11.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:11.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:12 compute-2 ceph-mon[74913]: pgmap v1299: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.0 KiB/s rd, 1 op/s
Oct 10 10:29:12 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2534647443' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:29:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:13.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:13 compute-2 nova_compute[235775]: 2025-10-10 10:29:13.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:13.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:14 compute-2 ceph-mon[74913]: pgmap v1300: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:15.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:15.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:16 compute-2 nova_compute[235775]: 2025-10-10 10:29:16.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:16 compute-2 ceph-mon[74913]: pgmap v1301: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:17.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:17 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:29:17 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1325988144' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:29:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:17.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:18 compute-2 nova_compute[235775]: 2025-10-10 10:29:18.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:18 compute-2 ceph-mon[74913]: pgmap v1302: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:18 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2046352366' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:29:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:19.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:29:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:19.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:29:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:20 compute-2 ceph-mon[74913]: pgmap v1303: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:21.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:21 compute-2 nova_compute[235775]: 2025-10-10 10:29:21.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:21 compute-2 podman[258536]: 2025-10-10 10:29:21.797625852 +0000 UTC m=+0.058807480 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct 10 10:29:21 compute-2 podman[258534]: 2025-10-10 10:29:21.809843312 +0000 UTC m=+0.075784272 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:29:21 compute-2 podman[258535]: 2025-10-10 10:29:21.834623533 +0000 UTC m=+0.100466370 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 10:29:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:21.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:22 compute-2 ceph-mon[74913]: pgmap v1304: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:23.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:23 compute-2 nova_compute[235775]: 2025-10-10 10:29:23.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:23 compute-2 sudo[258599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:29:23 compute-2 sudo[258599]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:29:23 compute-2 sudo[258599]: pam_unix(sudo:session): session closed for user root
Oct 10 10:29:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:23.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:24 compute-2 ceph-mon[74913]: pgmap v1305: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:29:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:25.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:29:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:25.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:26 compute-2 nova_compute[235775]: 2025-10-10 10:29:26.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:26 compute-2 ceph-mon[74913]: pgmap v1306: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:29:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1412912917' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:29:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:29:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1412912917' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:29:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:27.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1412912917' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:29:27 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1412912917' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:29:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:27.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:28 compute-2 nova_compute[235775]: 2025-10-10 10:29:28.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:28 compute-2 ceph-mon[74913]: pgmap v1307: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:29.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:29.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:30 compute-2 ceph-mon[74913]: pgmap v1308: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:31.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:31 compute-2 nova_compute[235775]: 2025-10-10 10:29:31.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:29:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:31.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:32 compute-2 ceph-mon[74913]: pgmap v1309: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:29:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:33.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:29:33 compute-2 nova_compute[235775]: 2025-10-10 10:29:33.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:33 compute-2 podman[258634]: 2025-10-10 10:29:33.771432493 +0000 UTC m=+0.044653657 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct 10 10:29:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:33.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:34 compute-2 ceph-mon[74913]: pgmap v1310: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:35.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:35.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:36 compute-2 nova_compute[235775]: 2025-10-10 10:29:36.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:36 compute-2 ceph-mon[74913]: pgmap v1311: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:37.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:37.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:38 compute-2 nova_compute[235775]: 2025-10-10 10:29:38.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:38 compute-2 ceph-mon[74913]: pgmap v1312: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:39.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:39.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:40 compute-2 ceph-mon[74913]: pgmap v1313: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:41.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:41 compute-2 nova_compute[235775]: 2025-10-10 10:29:41.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:29:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:29:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:29:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:29:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:29:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:29:41 compute-2 ceph-mon[74913]: pgmap v1314: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:41.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:43.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:43 compute-2 nova_compute[235775]: 2025-10-10 10:29:43.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:43 compute-2 sudo[258663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:29:43 compute-2 sudo[258663]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:29:43 compute-2 sudo[258663]: pam_unix(sudo:session): session closed for user root
Oct 10 10:29:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:43.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:44 compute-2 ceph-mon[74913]: pgmap v1315: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 10:29:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:45.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 10:29:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:45.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:46 compute-2 ceph-mon[74913]: pgmap v1316: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:46 compute-2 nova_compute[235775]: 2025-10-10 10:29:46.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:47.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:47 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:29:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:29:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:47.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:29:48 compute-2 nova_compute[235775]: 2025-10-10 10:29:48.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:48 compute-2 ceph-mon[74913]: pgmap v1317: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:49.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:49.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:50 compute-2 ceph-mon[74913]: pgmap v1318: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:51.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:51 compute-2 nova_compute[235775]: 2025-10-10 10:29:51.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:51.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:52 compute-2 ceph-mon[74913]: pgmap v1319: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:52 compute-2 podman[258699]: 2025-10-10 10:29:52.777703196 +0000 UTC m=+0.051922279 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 10 10:29:52 compute-2 podman[258697]: 2025-10-10 10:29:52.777789338 +0000 UTC m=+0.060299646 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:29:52 compute-2 podman[258698]: 2025-10-10 10:29:52.802764146 +0000 UTC m=+0.082379501 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 10 10:29:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:53.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:53 compute-2 nova_compute[235775]: 2025-10-10 10:29:53.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:53.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:54 compute-2 ceph-mon[74913]: pgmap v1320: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:29:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:29:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:55.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:29:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:29:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:55.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:56 compute-2 nova_compute[235775]: 2025-10-10 10:29:56.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:56 compute-2 ceph-mon[74913]: pgmap v1321: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:57.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:57.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:58 compute-2 nova_compute[235775]: 2025-10-10 10:29:58.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:29:58 compute-2 ceph-mon[74913]: pgmap v1322: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:29:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:29:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:29:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:29:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:29:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:59.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:29:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:29:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:29:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:29:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:59.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:00 compute-2 ceph-mon[74913]: pgmap v1323: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:30:00 compute-2 ceph-mon[74913]: Health detail: HEALTH_WARN 1 failed cephadm daemon(s)
Oct 10 10:30:00 compute-2 ceph-mon[74913]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Oct 10 10:30:00 compute-2 ceph-mon[74913]:     daemon nfs.cephfs.2.0.compute-0.ruydzo on compute-0 is in error state
Oct 10 10:30:00 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:00 compute-2 nova_compute[235775]: 2025-10-10 10:30:00.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:30:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:00 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:00 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:01 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:01 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:01.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:01 compute-2 nova_compute[235775]: 2025-10-10 10:30:01.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:01 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:30:01 compute-2 nova_compute[235775]: 2025-10-10 10:30:01.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:30:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:01 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:01 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:01 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:02 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:30:02 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:01.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:30:02 compute-2 ceph-mon[74913]: pgmap v1324: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:02 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:02 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:03 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:03 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:30:03 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:03.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:30:03 compute-2 nova_compute[235775]: 2025-10-10 10:30:03.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:03 compute-2 sudo[258773]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:30:03 compute-2 sudo[258773]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:30:03 compute-2 sudo[258773]: pam_unix(sudo:session): session closed for user root
Oct 10 10:30:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:03 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:03 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:04 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:04 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:04 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:04.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:04 compute-2 ceph-mon[74913]: pgmap v1325: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:30:04 compute-2 podman[258799]: 2025-10-10 10:30:04.80010275 +0000 UTC m=+0.068654113 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 10 10:30:04 compute-2 nova_compute[235775]: 2025-10-10 10:30:04.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:30:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:04 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:04 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:05 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:05 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:05 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:05.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:05 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:05 compute-2 nova_compute[235775]: 2025-10-10 10:30:05.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:30:05 compute-2 nova_compute[235775]: 2025-10-10 10:30:05.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:30:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:05 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:05 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:06 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:06 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:06 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:06.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:06 compute-2 nova_compute[235775]: 2025-10-10 10:30:06.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:06 compute-2 ceph-mon[74913]: pgmap v1326: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:06 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:06 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:07 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:07 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:07 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:07.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:07 compute-2 nova_compute[235775]: 2025-10-10 10:30:07.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:30:07 compute-2 nova_compute[235775]: 2025-10-10 10:30:07.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 10:30:07 compute-2 nova_compute[235775]: 2025-10-10 10:30:07.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 10:30:07 compute-2 nova_compute[235775]: 2025-10-10 10:30:07.846 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 10:30:07 compute-2 nova_compute[235775]: 2025-10-10 10:30:07.847 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:30:07 compute-2 nova_compute[235775]: 2025-10-10 10:30:07.874 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:30:07 compute-2 nova_compute[235775]: 2025-10-10 10:30:07.874 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:30:07 compute-2 nova_compute[235775]: 2025-10-10 10:30:07.875 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:30:07 compute-2 nova_compute[235775]: 2025-10-10 10:30:07.875 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 10:30:07 compute-2 nova_compute[235775]: 2025-10-10 10:30:07.876 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:30:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:07 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:07 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:08 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:08 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:08 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:08.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:08 compute-2 nova_compute[235775]: 2025-10-10 10:30:08.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:08 compute-2 sudo[258841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:30:08 compute-2 sudo[258841]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:30:08 compute-2 sudo[258841]: pam_unix(sudo:session): session closed for user root
Oct 10 10:30:08 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:30:08 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1914201702' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:30:08 compute-2 sudo[258866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Oct 10 10:30:08 compute-2 sudo[258866]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:30:08 compute-2 nova_compute[235775]: 2025-10-10 10:30:08.384 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:30:08 compute-2 ceph-mon[74913]: pgmap v1327: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:08 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1914201702' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:30:08 compute-2 nova_compute[235775]: 2025-10-10 10:30:08.545 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 10:30:08 compute-2 nova_compute[235775]: 2025-10-10 10:30:08.547 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4828MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 10:30:08 compute-2 nova_compute[235775]: 2025-10-10 10:30:08.548 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:30:08 compute-2 nova_compute[235775]: 2025-10-10 10:30:08.548 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:30:08 compute-2 nova_compute[235775]: 2025-10-10 10:30:08.610 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 10:30:08 compute-2 nova_compute[235775]: 2025-10-10 10:30:08.610 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 10:30:08 compute-2 nova_compute[235775]: 2025-10-10 10:30:08.624 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 10:30:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:08 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:08 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:09 compute-2 podman[258989]: 2025-10-10 10:30:09.020921756 +0000 UTC m=+0.084800697 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Oct 10 10:30:09 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 10:30:09 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/919033109' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:30:09 compute-2 nova_compute[235775]: 2025-10-10 10:30:09.134 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 10:30:09 compute-2 podman[258989]: 2025-10-10 10:30:09.138305514 +0000 UTC m=+0.202184425 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Oct 10 10:30:09 compute-2 nova_compute[235775]: 2025-10-10 10:30:09.139 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 10:30:09 compute-2 nova_compute[235775]: 2025-10-10 10:30:09.158 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 10:30:09 compute-2 nova_compute[235775]: 2025-10-10 10:30:09.160 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 10:30:09 compute-2 nova_compute[235775]: 2025-10-10 10:30:09.160 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:30:09 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:09 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:09 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:09.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:09 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct 10 10:30:09 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/919033109' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:30:09 compute-2 podman[259110]: 2025-10-10 10:30:09.564079519 +0000 UTC m=+0.056678500 container exec 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 10:30:09 compute-2 podman[259110]: 2025-10-10 10:30:09.596407382 +0000 UTC m=+0.089006353 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 10:30:09 compute-2 podman[259201]: 2025-10-10 10:30:09.94712064 +0000 UTC m=+0.064994416 container exec eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 10:30:09 compute-2 podman[259201]: 2025-10-10 10:30:09.961161768 +0000 UTC m=+0.079035454 container exec_died eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 10:30:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:09 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:09 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:10 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:10 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:10 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:10.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:10 compute-2 nova_compute[235775]: 2025-10-10 10:30:10.128 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:30:10 compute-2 nova_compute[235775]: 2025-10-10 10:30:10.145 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:30:10 compute-2 podman[259266]: 2025-10-10 10:30:10.203753994 +0000 UTC m=+0.064513711 container exec 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 10:30:10 compute-2 podman[259266]: 2025-10-10 10:30:10.238222295 +0000 UTC m=+0.098981982 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 10:30:10 compute-2 podman[259334]: 2025-10-10 10:30:10.441077591 +0000 UTC m=+0.048342145 container exec 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.openshift.expose-services=, release=1793, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, architecture=x86_64, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2)
Oct 10 10:30:10 compute-2 podman[259334]: 2025-10-10 10:30:10.453126106 +0000 UTC m=+0.060390650 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Oct 10 10:30:10 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:10 compute-2 ceph-mon[74913]: pgmap v1328: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:30:10 compute-2 sudo[258866]: pam_unix(sudo:session): session closed for user root
Oct 10 10:30:10 compute-2 sudo[259404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Oct 10 10:30:10 compute-2 sudo[259404]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:30:10 compute-2 sudo[259404]: pam_unix(sudo:session): session closed for user root
Oct 10 10:30:10 compute-2 sudo[259429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Oct 10 10:30:10 compute-2 sudo[259429]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:30:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:10 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:10 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:11 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:11 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:11 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:11.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:11 compute-2 sudo[259429]: pam_unix(sudo:session): session closed for user root
Oct 10 10:30:11 compute-2 nova_compute[235775]: 2025-10-10 10:30:11.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:30:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:30:11 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1921870048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:30:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:30:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:30:11 compute-2 ceph-mon[74913]: pgmap v1329: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:11 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 10:30:11 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3445838709' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:30:11 compute-2 nova_compute[235775]: 2025-10-10 10:30:11.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 10:30:11 compute-2 nova_compute[235775]: 2025-10-10 10:30:11.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 10:30:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:11 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:11 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:12 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:12 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:12 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:12.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:12 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct 10 10:30:12 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:30:12 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 10:30:12 compute-2 ceph-mon[74913]: pgmap v1330: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 1 op/s
Oct 10 10:30:12 compute-2 ceph-mon[74913]: pgmap v1331: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 721 B/s rd, 0 op/s
Oct 10 10:30:12 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:30:12 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:30:12 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Oct 10 10:30:12 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 10:30:12 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:30:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:12 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:12 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:13 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:13 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:13 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:13.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:13 compute-2 nova_compute[235775]: 2025-10-10 10:30:13.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:13 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:13 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:14 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:14 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:14 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:14.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:14 compute-2 ceph-mon[74913]: pgmap v1332: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 1 op/s
Oct 10 10:30:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:14 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:14 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:15 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:15 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:15 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:15.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:15 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:15 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:15 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:16 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:16 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:16 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:16.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:16 compute-2 nova_compute[235775]: 2025-10-10 10:30:16.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:16 compute-2 sudo[259491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Oct 10 10:30:16 compute-2 sudo[259491]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:30:16 compute-2 sudo[259491]: pam_unix(sudo:session): session closed for user root
Oct 10 10:30:16 compute-2 ceph-mon[74913]: pgmap v1333: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 1 op/s
Oct 10 10:30:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:30:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:30:16 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 10:30:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:16 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:16 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:17 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:17 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:17 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:17.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:17 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/862817915' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:30:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:17 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:17 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:18 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:18 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:18 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:18.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:18 compute-2 nova_compute[235775]: 2025-10-10 10:30:18.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:18 compute-2 ceph-mon[74913]: pgmap v1334: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 1 op/s
Oct 10 10:30:18 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/832635326' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 10:30:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:18 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:18 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:19 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:19 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:19 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:19.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:19 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:19 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:20 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:20 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:20 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:20.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:20 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:20 compute-2 ceph-mon[74913]: pgmap v1335: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 1 op/s
Oct 10 10:30:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:20 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:20 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:21 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:21 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:21 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:21.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:21 compute-2 nova_compute[235775]: 2025-10-10 10:30:21.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:21 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:21 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:22 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:22 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:22 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:22.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:22 compute-2 ceph-mon[74913]: pgmap v1336: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Oct 10 10:30:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:22 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:22 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:23 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:23 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:23 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:23.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:23 compute-2 nova_compute[235775]: 2025-10-10 10:30:23.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:23 compute-2 sudo[259523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:30:23 compute-2 sudo[259523]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:30:23 compute-2 sudo[259523]: pam_unix(sudo:session): session closed for user root
Oct 10 10:30:23 compute-2 podman[259549]: 2025-10-10 10:30:23.695549134 +0000 UTC m=+0.063740137 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 10 10:30:23 compute-2 podman[259547]: 2025-10-10 10:30:23.70386793 +0000 UTC m=+0.070508623 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct 10 10:30:23 compute-2 podman[259548]: 2025-10-10 10:30:23.728872748 +0000 UTC m=+0.097097962 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:30:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:23 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:23 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:24 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:24 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:24 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:24.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:24 compute-2 ceph-mon[74913]: pgmap v1337: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:30:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:24 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:24 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:25 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:25 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:25 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:25.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:25 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:25 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:25 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:26 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:26 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:26 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:26.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:26 compute-2 nova_compute[235775]: 2025-10-10 10:30:26.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 10:30:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1496174560' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:30:26 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 10:30:26 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1496174560' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:30:26 compute-2 ceph-mon[74913]: pgmap v1338: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1496174560' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 10:30:26 compute-2 ceph-mon[74913]: from='client.? 192.168.122.10:0/1496174560' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 10:30:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:26 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:26 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:27 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:27 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:27 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:27.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:27 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:27 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:28 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:28 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:28 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:28.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:28 compute-2 nova_compute[235775]: 2025-10-10 10:30:28.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:28 compute-2 ceph-mon[74913]: pgmap v1339: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:30:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:28 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:28 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:29 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:29 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:29 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:29.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:29 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:29 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:30 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:30 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:30 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:30.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:30 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:30 compute-2 ceph-mon[74913]: pgmap v1340: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:30 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:30 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:31 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:31 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:31 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:31.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:31 compute-2 nova_compute[235775]: 2025-10-10 10:30:31.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:31 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:30:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:31 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:31 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:32 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:32 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:32 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:32.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:32 compute-2 ceph-mon[74913]: pgmap v1341: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:32 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:32 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:33 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:33 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:33 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:33.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:33 compute-2 nova_compute[235775]: 2025-10-10 10:30:33.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:33 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:33 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:34 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:34 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:30:34 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:34.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:30:34 compute-2 sshd-session[259622]: Accepted publickey for zuul from 192.168.122.10 port 46742 ssh2: ECDSA SHA256:OTD5B+ahDqExNS+mhJP5lz4CJKQqbHlXujfiLvlujac
Oct 10 10:30:34 compute-2 systemd-logind[796]: New session 60 of user zuul.
Oct 10 10:30:34 compute-2 systemd[1]: Started Session 60 of User zuul.
Oct 10 10:30:34 compute-2 sshd-session[259622]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 10 10:30:34 compute-2 sudo[259626]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 10 10:30:34 compute-2 sudo[259626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 10 10:30:34 compute-2 ceph-mon[74913]: pgmap v1342: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:30:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:34 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:34 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:35 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:35 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:35 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:35.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:35 compute-2 podman[259662]: 2025-10-10 10:30:35.327893293 +0000 UTC m=+0.081102200 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 10 10:30:35 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:35 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:35 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:36 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:36 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:36 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:36.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:36 compute-2 nova_compute[235775]: 2025-10-10 10:30:36.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:36 compute-2 ceph-mon[74913]: pgmap v1343: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:36 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:36 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:37 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:37 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:37 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:37.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:37 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Oct 10 10:30:37 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1165768626' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:30:37 compute-2 ceph-mon[74913]: from='client.27688 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:37 compute-2 ceph-mon[74913]: from='client.26969 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:37 compute-2 ceph-mon[74913]: from='client.18153 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:37 compute-2 ceph-mon[74913]: from='client.27703 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:37 compute-2 ceph-mon[74913]: from='client.18159 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:37 compute-2 ceph-mon[74913]: from='client.26975 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:37 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1165768626' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:30:37 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/43274649' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:30:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:37 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:37 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:38 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:38 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 10:30:38 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:38.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 10:30:38 compute-2 nova_compute[235775]: 2025-10-10 10:30:38.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:38 compute-2 ceph-mon[74913]: pgmap v1344: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:30:38 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3130338205' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 10:30:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:38 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:38 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:39 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:39 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:39 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:39.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.669300) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239669368, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1594, "num_deletes": 258, "total_data_size": 3890492, "memory_usage": 3937872, "flush_reason": "Manual Compaction"}
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239686596, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 2541098, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39513, "largest_seqno": 41101, "table_properties": {"data_size": 2534363, "index_size": 3806, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 14470, "raw_average_key_size": 20, "raw_value_size": 2520675, "raw_average_value_size": 3486, "num_data_blocks": 164, "num_entries": 723, "num_filter_entries": 723, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760092107, "oldest_key_time": 1760092107, "file_creation_time": 1760092239, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 17351 microseconds, and 9966 cpu microseconds.
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.686659) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 2541098 bytes OK
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.686683) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.687900) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.687919) EVENT_LOG_v1 {"time_micros": 1760092239687912, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.687941) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 3883108, prev total WAL file size 3883108, number of live WAL files 2.
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.689476) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303036' seq:72057594037927935, type:22 .. '6C6F676D0031323630' seq:0, type:0; will stop at (end)
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(2481KB)], [75(12MB)]
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239689502, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15157428, "oldest_snapshot_seqno": -1}
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6866 keys, 14996081 bytes, temperature: kUnknown
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239772856, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 14996081, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14950753, "index_size": 27040, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17221, "raw_key_size": 180370, "raw_average_key_size": 26, "raw_value_size": 14827539, "raw_average_value_size": 2159, "num_data_blocks": 1069, "num_entries": 6866, "num_filter_entries": 6866, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760092239, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.773057) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 14996081 bytes
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.774147) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 181.7 rd, 179.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 12.0 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(11.9) write-amplify(5.9) OK, records in: 7400, records dropped: 534 output_compression: NoCompression
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.774162) EVENT_LOG_v1 {"time_micros": 1760092239774154, "job": 46, "event": "compaction_finished", "compaction_time_micros": 83420, "compaction_time_cpu_micros": 26259, "output_level": 6, "num_output_files": 1, "total_output_size": 14996081, "num_input_records": 7400, "num_output_records": 6866, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239774629, "job": 46, "event": "table_file_deletion", "file_number": 77}
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239776317, "job": 46, "event": "table_file_deletion", "file_number": 75}
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.689417) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.776357) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.776362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.776363) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.776365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:30:39 compute-2 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.776366) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 10:30:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:39 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:39 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:40 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:40 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:40 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:40.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:40 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:40 compute-2 ovs-vsctl[259968]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 10 10:30:40 compute-2 ceph-mon[74913]: pgmap v1345: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:40 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:40 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:41 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:41 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:41 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:41.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:41 compute-2 nova_compute[235775]: 2025-10-10 10:30:41.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:30:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 10:30:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:30:41.486 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 10:30:41 compute-2 ovn_metadata_agent[141790]: 2025-10-10 10:30:41.486 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 10:30:41 compute-2 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 10 10:30:41 compute-2 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 10 10:30:41 compute-2 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 10 10:30:41 compute-2 unix_chkpwd[260114]: password check failed for user (root)
Oct 10 10:30:41 compute-2 sshd-session[260008]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 10 10:30:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:41 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:41 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:42 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: cache status {prefix=cache status} (starting...)
Oct 10 10:30:42 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:42 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:42 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:42.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:42 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: client ls {prefix=client ls} (starting...)
Oct 10 10:30:42 compute-2 lvm[260315]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 10:30:42 compute-2 lvm[260315]: VG ceph_vg0 finished
Oct 10 10:30:42 compute-2 ceph-mon[74913]: pgmap v1346: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:42 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: damage ls {prefix=damage ls} (starting...)
Oct 10 10:30:42 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Oct 10 10:30:42 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/561687218' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:30:42 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump loads {prefix=dump loads} (starting...)
Oct 10 10:30:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:42 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:42 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:43 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Oct 10 10:30:43 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Oct 10 10:30:43 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:43 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:43 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:43.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:43 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Oct 10 10:30:43 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1504914682' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:30:43 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Oct 10 10:30:43 compute-2 nova_compute[235775]: 2025-10-10 10:30:43.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:43 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Oct 10 10:30:43 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Oct 10 10:30:43 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/171472224' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct 10 10:30:43 compute-2 sudo[260564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Oct 10 10:30:43 compute-2 sudo[260564]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Oct 10 10:30:43 compute-2 sudo[260564]: pam_unix(sudo:session): session closed for user root
Oct 10 10:30:43 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Oct 10 10:30:43 compute-2 ceph-mon[74913]: from='client.27724 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:43 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/561687218' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:30:43 compute-2 ceph-mon[74913]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:30:43 compute-2 ceph-mon[74913]: from='client.27736 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:43 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1504914682' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:30:43 compute-2 ceph-mon[74913]: from='client.18186 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:43 compute-2 ceph-mon[74913]: from='client.27751 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:43 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4271142430' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:30:43 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/171472224' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct 10 10:30:43 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: get subtrees {prefix=get subtrees} (starting...)
Oct 10 10:30:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:43 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:43 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:44 compute-2 sshd-session[260008]: Failed password for root from 80.94.93.119 port 50722 ssh2
Oct 10 10:30:44 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:44 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:44 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:44.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:44 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: ops {prefix=ops} (starting...)
Oct 10 10:30:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Oct 10 10:30:44 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2269363297' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Oct 10 10:30:44 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/17616546' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Oct 10 10:30:44 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1752163504' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: session ls {prefix=session ls} (starting...)
Oct 10 10:30:44 compute-2 ceph-mon[74913]: pgmap v1347: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.18204 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.27763 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.26996 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3593539214' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/206671409' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2269363297' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.18222 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1064114185' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.27008 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/17616546' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2810672141' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.27808 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1752163504' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3250802086' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2558230964' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct 10 10:30:44 compute-2 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: status {prefix=status} (starting...)
Oct 10 10:30:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:44 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:44 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:45 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:45 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:45 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:45.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Oct 10 10:30:45 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3206344806' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Oct 10 10:30:45 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2827576184' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:30:45 compute-2 unix_chkpwd[260841]: password check failed for user (root)
Oct 10 10:30:45 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Oct 10 10:30:45 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2830212554' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.18243 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.27029 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.27832 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/132749714' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.27044 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.27850 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/731166632' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2494599266' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3206344806' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/565812220' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3277923492' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2827576184' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:30:45 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2830212554' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 10:30:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:45 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:45 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:46 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Oct 10 10:30:46 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2107248748' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:30:46 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:46 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:46 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:46.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:46 compute-2 nova_compute[235775]: 2025-10-10 10:30:46.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:46 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Oct 10 10:30:46 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3723288549' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Oct 10 10:30:46 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2575731666' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.18297 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.27077 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: pgmap v1348: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3111998569' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1189640479' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3172139523' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2107248748' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.27092 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.27901 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/110223178' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3723288549' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3461548462' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/547263865' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3113939320' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2575731666' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Oct 10 10:30:46 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1715209047' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:30:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:46 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:46 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:47 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Oct 10 10:30:47 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/910388127' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct 10 10:30:47 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:47 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:47 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:47.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:47 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Oct 10 10:30:47 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4142265369' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:30:47 compute-2 sshd-session[260008]: Failed password for root from 80.94.93.119 port 50722 ssh2
Oct 10 10:30:47 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Oct 10 10:30:47 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1724959652' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:30:47 compute-2 unix_chkpwd[261214]: password check failed for user (root)
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3006396374' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.18363 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1715209047' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2912853044' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3051349397' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/910388127' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3498770159' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.27949 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2040415731' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3858622949' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.27140 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4142265369' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1724959652' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/683696595' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2898906544' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Oct 10 10:30:47 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1085433634' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:30:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:47 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:47 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:48 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:48 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:48 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:48.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:58.984357+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:57:59.984514+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:00.984659+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:01.984874+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:02.985022+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:03.985195+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:04.985348+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:05.985484+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:06.985664+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:07.985782+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:08.985880+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:09.986034+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:10.986172+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:11.986304+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:12.986448+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:13.986602+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:14.986768+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:15.986890+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:16.987534+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:17.987651+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:18.987924+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:19.988099+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:20.988307+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:21.988546+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:22.988762+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:23.988974+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:24.989265+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:25.989485+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:26.989692+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:27.989986+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:28.990167+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:29.990301+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:30.990555+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:31.990719+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:32.990881+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:33.991043+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:34.991161+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:35.991399+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:36.991588+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:37.991728+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:38.991919+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:39.992039+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:40.992216+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:41.992359+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:42.992591+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:43.992773+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:44.992935+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:45.993058+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:46.993202+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 1155072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:47.993323+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 1155072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:48.993546+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:49.993715+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:50.993878+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:51.994007+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 1138688 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:52.994211+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:53.994527+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:54.994746+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:55.994945+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:56.995069+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:57.995257+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:58.995432+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:58:59.995611+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:00.995780+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:01.995952+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:02.996086+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:03.996216+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:04.996331+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:05.996480+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:06.996720+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:07.996879+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:08.997065+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:09.997188+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:10.997327+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:11.997457+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:12.997594+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:13.998034+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:14.998212+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:15.998331+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:16.998499+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:17.998642+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:18.998955+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:19.999136+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:20.999294+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:21.999447+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:22.999608+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:23.999800+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:24.999983+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:26.000114+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:27.000293+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:28.000406+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:29.000587+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:30.000758+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:31.000939+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:32.001088+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:33.001275+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:34.001553+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:35.001708+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:36.001852+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:37.001984+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:38.002122+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:39.002291+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:40.002461+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:41.002606+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:42.002741+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:43.002889+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:44.003067+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:45.003253+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb229ed400 session 0x55cb257c9860
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:46.003666+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:47.003738+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:48.003874+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:49.004103+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:50.004233+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:51.004373+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:52.004544+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:53.004676+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:54.004848+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:55.004999+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:56.005139+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b0400 session 0x55cb257770e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:57.005283+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:58.005481+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T09:59:59.006310+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:00.006495+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:01.006638+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:02.006781+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:03.006885+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:04.007043+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:05.007166+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:06.007291+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:07.007448+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:08.007574+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:09.007701+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:10.007865+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 193.200393677s of 193.215057373s, submitted: 3
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:11.008007+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:12.008131+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:13.008257+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888693 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:14.008417+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:15.008555+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:16.008680+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:17.008791+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:18.008909+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:19.009020+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:20.009140+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:21.009300+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:22.010192+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:23.010677+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:24.010900+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:25.011908+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:26.012042+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:27.012290+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:28.012618+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:29.012772+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:30.012915+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:31.013535+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:32.013721+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:33.014024+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:34.014364+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:35.014907+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb24f1a3c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:36.015092+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 909312 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:37.015274+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 909312 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:38.015472+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:39.015817+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:40.016036+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:41.016327+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:42.016620+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:43.016903+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:44.017165+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:45.017384+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.551929474s of 34.587165833s, submitted: 2
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:46.017571+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:47.017763+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:48.017979+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 891717 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:49.018131+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:50.018271+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:51.018519+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:52.018684+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:53.018870+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:54.019081+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:55.019225+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:56.019371+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:57.019518+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:58.019902+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1800 session 0x55cb25234960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:00:59.020105+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:00.020433+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:01.020709+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:02.020960+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:03.021174+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:04.021412+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:05.021563+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:06.021751+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:07.022054+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:08.022310+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:09.022517+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:10.022808+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:11.023043+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:12.023228+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.263769150s of 27.276128769s, submitted: 4
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:13.023460+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897174 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:14.023758+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:15.023925+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:16.024163+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:17.024368+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:18.024569+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897174 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:19.024745+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:20.024862+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:21.024985+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:22.025095+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:23.025334+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:24.026084+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:25.027084+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:26.028856+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:27.029018+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:28.029165+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:29.029353+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:30.029795+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb22622400 session 0x55cb23c641e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb229ed400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:31.029946+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:32.030146+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:33.030319+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:34.030646+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:35.031536+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:36.031938+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:37.032311+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:38.032482+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:39.032667+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:40.033764+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb252354a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:41.033930+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:42.034107+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:43.034363+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:44.034612+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:45.034885+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:46.035092+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:47.035233+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:48.035366+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:49.035485+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:50.035618+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:51.035747+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:52.036131+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:53.036307+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:54.036526+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.978160858s of 41.991683960s, submitted: 4
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:55.036660+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:56.036872+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 761856 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:57.037113+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:58.037279+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897504 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:01:59.037406+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:00.037531+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:01.037657+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:02.037776+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:03.037923+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:04.038096+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:05.038355+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:06.038544+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:07.038692+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:08.038817+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:09.039043+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:10.039250+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:11.039391+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:12.039555+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:13.039719+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:14.039915+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:15.040067+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:16.040314+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:17.040469+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:18.040667+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:19.040814+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:20.041017+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:21.041350+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:22.041515+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:23.041754+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:24.041984+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:25.042111+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:26.042228+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:27.042517+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:28.042697+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:29.043240+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:30.043446+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:31.043618+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:32.045367+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 745472 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:33.046663+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:34.046909+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1c00 session 0x55cb2397f4a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:35.047141+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:36.047304+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:37.048023+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:38.048479+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:39.048615+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:40.049188+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:41.049329+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:42.049470+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:43.049691+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:44.049896+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:45.050025+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:46.050149+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:47.050599+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:48.051463+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 53.931362152s of 53.942428589s, submitted: 3
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897834 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:49.051867+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:50.052247+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:51.052535+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:52.052707+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:53.053763+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897834 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:54.054259+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:55.055116+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:56.055330+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:57.055668+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:58.055913+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:02:59.056184+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:00.056327+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:01.057259+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:02.057431+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:03.057674+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:04.057903+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:05.058123+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:06.058318+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:07.058603+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:08.058786+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:09.058946+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:10.059114+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.023803711s of 22.031444550s, submitted: 2
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:11.059297+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:12.059437+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:13.059575+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:14.059749+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:15.059894+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:16.060018+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:17.060184+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:18.060342+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:19.060513+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:20.060653+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:21.060798+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:22.060927+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:23.061024+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:24.061420+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:25.061587+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:26.061726+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb2515f000 session 0x55cb256783c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:27.061893+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:28.061997+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:29.062178+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:30.062371+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:31.062524+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:32.062682+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:33.062842+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:34.063652+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:35.063778+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:36.064415+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:37.064629+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:38.065258+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:39.065402+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.378730774s of 29.382411957s, submitted: 1
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:40.065607+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:41.065739+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:42.065884+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:43.066004+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:44.066379+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:45.066536+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:46.066673+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:47.066810+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:48.067075+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:49.067241+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:50.067391+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:51.067523+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:52.067728+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:53.068004+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:54.068247+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:55.068448+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:56.068623+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:57.068800+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:58.068982+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb256ab0e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:03:59.069224+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:00.069433+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:01.069688+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:02.069945+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:03.070102+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:04.070280+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:05.070439+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:06.070804+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:07.071111+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:08.071311+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:09.071531+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:10.073288+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:11.073532+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:12.073667+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 630784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.485904694s of 32.491119385s, submitted: 1
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:13.073814+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:14.074019+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899676 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:15.074209+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:16.074333+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:17.074580+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:18.074936+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:19.075109+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899676 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:20.075282+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:21.075523+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:22.075681+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:23.075984+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:24.076465+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:25.076721+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:26.076940+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread fragmentation_score=0.000024 took=0.000092s
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:27.077307+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:28.077507+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:29.077694+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:30.078003+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:31.078206+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:32.078391+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 581632 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:33.078696+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 581632 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:34.078935+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:35.079205+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:36.079375+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:37.079627+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:38.080032+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:39.080222+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:40.081433+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:41.081711+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:42.081902+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:43.082176+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:44.082346+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:45.082480+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:46.083292+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:47.083431+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:48.083554+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:49.083682+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:50.083769+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:51.083881+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:52.083982+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:53.084046+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:54.084302+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:55.084430+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:56.084569+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:57.085343+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:58.085522+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:04:59.085745+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:00.086100+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:01.086240+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:02.086363+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:03.086513+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:04.086775+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:05.087010+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:06.087305+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:07.087494+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:08.087633+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:09.087786+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:10.087892+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:11.088012+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:12.088188+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:13.088357+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:14.088531+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:15.088667+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:16.088897+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:17.089035+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:18.089189+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:19.089409+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 5994 writes, 24K keys, 5994 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 5994 writes, 1097 syncs, 5.46 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 448 writes, 699 keys, 448 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
                                           Interval WAL: 448 writes, 217 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb215109b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:20.089571+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:21.089794+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:22.089992+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:23.090197+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:24.090386+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:25.090587+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:26.090774+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:27.090978+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:28.091136+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:29.091317+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:30.091488+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:31.091619+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:32.091797+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:33.092091+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:34.092373+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:35.092601+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:36.092778+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:37.092980+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:38.093108+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:39.093256+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:40.093374+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:41.093556+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:42.093715+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:43.093891+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:44.094106+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:45.094347+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:46.094560+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:47.094818+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:48.095137+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23d67400 session 0x55cb257c9e00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:49.095371+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:50.095537+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:51.095798+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:52.096097+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:53.096331+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:54.096559+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:55.096699+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:56.096937+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:57.097155+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:58.097338+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:05:59.097477+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:00.097641+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:01.097811+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:02.098070+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:03.098293+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:04.098494+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:05.098731+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:06.098952+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:07.099165+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:08.099366+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 116.097770691s of 116.122451782s, submitted: 2
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:09.099565+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:10.099710+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:11.099954+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:12.100138+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:13.100319+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:14.100602+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:15.102526+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:16.102668+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:17.104229+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:18.105623+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:19.106902+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:20.107194+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:21.108245+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:22.109126+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:23.109961+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:24.110751+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:25.111080+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:26.111656+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:27.111888+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:28.112361+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:29.112868+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:30.113125+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.223886490s of 22.228187561s, submitted: 1
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:31.113564+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:32.113931+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb25216f00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:33.114303+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:34.114623+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:35.114930+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 1490944 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:36.115134+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:37.115465+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:38.115597+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:39.115896+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:40.116073+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:41.116230+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:42.116674+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:43.116984+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:44.117301+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:45.117549+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:46.117713+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.055137634s of 15.676420212s, submitted: 212
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:47.117921+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:48.118106+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b0400 session 0x55cb257c9680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:49.118308+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904542 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:50.118499+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:51.118673+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:52.118805+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:53.118970+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:54.119184+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903951 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:55.119410+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:56.119629+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:57.119963+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:58.120246+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23d67400 session 0x55cb24a952c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:06:59.120457+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:00.120665+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:01.120945+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:02.121124+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:03.121360+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:04.121613+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:05.121800+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:06.121981+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:07.122168+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:08.122387+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:09.122609+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:10.122857+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:11.123050+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:12.123225+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:13.123419+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:14.123624+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:15.123908+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.002698898s of 29.016599655s, submitted: 4
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:16.124136+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:17.124408+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:18.124590+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:19.124748+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:20.124905+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:21.125075+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:22.125302+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:23.125471+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:24.125641+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:25.125792+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:26.125930+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:27.126029+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:28.126177+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:29.126277+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:30.126399+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:31.126526+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:32.126644+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:33.126799+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:34.126988+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:35.127100+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:36.127212+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:37.127349+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:38.127437+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:39.127567+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:40.127705+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:41.127902+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:42.128082+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:43.128281+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:44.128450+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:45.128621+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:46.128814+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:47.129082+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:48.129227+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:49.129394+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:50.129659+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:51.129812+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:52.129955+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:53.130104+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:54.130280+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:55.130417+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:56.130583+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:57.130741+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:58.130892+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:07:59.131010+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:00.131223+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:01.131377+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:02.131532+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:03.131730+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:04.132101+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:05.132293+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:06.132428+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:07.132569+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:08.132776+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:09.132954+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:10.133071+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:11.133261+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:12.133476+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:13.133649+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:14.133869+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:15.134034+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:16.134191+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:17.134372+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:18.134475+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:19.134690+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:20.134895+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:21.135060+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:22.135241+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24f1ab40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:23.135441+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:24.135618+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:25.135773+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:26.135940+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:27.136095+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:28.136244+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:29.136353+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:30.136528+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:31.136662+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:32.136880+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:33.137090+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:34.137277+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:35.137463+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:36.137596+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 81.090309143s of 81.102882385s, submitted: 3
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:37.137747+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:38.137968+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:39.138127+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:40.138339+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905793 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:41.138508+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:42.138651+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:43.138954+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:44.139190+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:45.139430+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:46.139586+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:47.139749+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:48.139948+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:49.140218+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:50.140410+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb25216000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:51.140609+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:52.140867+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:53.141113+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:54.141385+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:55.141545+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:56.141725+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:57.141902+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:58.142055+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:08:59.142202+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:00.142363+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:01.142542+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:02.142697+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:03.142931+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:04.143196+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:05.143345+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.442283630s of 28.465816498s, submitted: 2
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 908226 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:06.143518+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:07.143668+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:08.143926+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:09.144083+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:10.144240+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 908226 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:11.144400+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:12.144471+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:13.144630+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:14.144809+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:15.145010+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 907635 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:16.145170+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:17.145329+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:18.145470+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:19.145641+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:20.145803+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 907635 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb252174a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:21.145954+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.102451324s of 16.118749619s, submitted: 3
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75218944 unmapped: 237568 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:22.146086+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _renew_subs
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:23.146210+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _renew_subs
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 146 ms_handle_reset con 0x55cb23d67400 session 0x55cb237c54a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:24.146450+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0xfdb10/0x1b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fca64000/0x0/0x4ffc00000, data 0xffae2/0x1b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:25.146682+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926645 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _renew_subs
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb252b1c00 session 0x55cb23dda000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 65536 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:26.146979+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:27.147203+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:28.147417+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:29.147655+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:30.147877+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929443 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:31.148023+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:32.148178+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:33.148302+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:34.148473+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.733337402s of 13.866815567s, submitted: 54
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:35.148606+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930955 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:36.148742+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:37.149015+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25754800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:38.149124+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:39.149292+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:40.149473+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931459 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:41.149611+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:42.149757+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:43.149926+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:44.150108+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:45.150268+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930277 data_alloc: 218103808 data_used: 53248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:46.150381+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:47.150579+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:48.150754+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:49.150923+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:50.151103+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:51.151297+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:52.151447+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:53.151646+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:54.151910+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:55.152107+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:56.152324+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:57.152483+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:58.152689+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:09:59.152898+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:00.153146+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:01.153329+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:02.153512+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:03.153689+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:04.153950+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb2515f000 session 0x55cb256aa1e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25754c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.409894943s of 29.425935745s, submitted: 4
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb25754c00 session 0x55cb256aa5a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb23c79400 session 0x55cb25679e00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 180224 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Oct 10 10:30:48 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2303646199' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:05.154144+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 180224 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:06.154381+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 1228800 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fca5f000/0x0/0x4ffc00000, data 0x103cdd/0x1bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:07.154535+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb23d67400 session 0x55cb24a1e960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb2515f000 session 0x55cb25217860
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb252b1c00 session 0x55cb250810e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb25755000 session 0x55cb23ddb860
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb23c79400 session 0x55cb23ddb0e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 11272192 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:08.154683+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 11264000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:09.154910+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78888960 unmapped: 10215424 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c4000/0x0/0x4ffc00000, data 0xa9ae51/0xb56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:10.155048+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019665 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24c6a1e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:11.155216+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:12.155367+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f000 session 0x55cb25080d20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:13.155551+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:14.155759+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c4000/0x0/0x4ffc00000, data 0xa9ae51/0xb56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb252a34a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.058108330s of 10.286009789s, submitted: 78
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb257765a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 10240000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:15.155964+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1018571 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 10240000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:16.156175+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 80191488 unmapped: 8912896 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:17.156323+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88702976 unmapped: 401408 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:18.156507+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88702976 unmapped: 401408 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:19.156756+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:20.156987+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1085887 data_alloc: 234881024 data_used: 10084352
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:21.157197+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:22.157324+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:23.157476+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:24.157656+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:25.157816+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1086640 data_alloc: 234881024 data_used: 10084352
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:26.158149+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.044337273s of 12.061671257s, submitted: 5
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:27.158358+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 95715328 unmapped: 4505600 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:28.158519+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 96894976 unmapped: 3325952 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:29.158695+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97017856 unmapped: 3203072 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:30.158814+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1210611 data_alloc: 234881024 data_used: 10915840
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:31.158942+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:32.159094+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:33.159247+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:34.159430+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:35.159579+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1210611 data_alloc: 234881024 data_used: 10915840
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:36.159741+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:37.159898+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:38.160025+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:39.160152+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:40.160301+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1211523 data_alloc: 234881024 data_used: 10985472
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:41.160476+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:42.160974+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:43.161123+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:44.161333+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:45.161514+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1211523 data_alloc: 234881024 data_used: 10985472
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:46.161668+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:47.161820+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:48.161955+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb2397e000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb2397f2c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755c00 session 0x55cb2397e960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:49.162087+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515fc00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.521245956s of 22.787330627s, submitted: 111
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb24c590e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb251034a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 96673792 unmapped: 3547136 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:50.162206+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207099 data_alloc: 234881024 data_used: 10989568
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb25103680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515fc00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb25102780
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24a1f860
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb24a1ed20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755c00 session 0x55cb24a1f0e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb22a681e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:51.162371+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:52.162523+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:53.162610+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb2397e3c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:54.162784+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:55.162928+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1274581 data_alloc: 234881024 data_used: 10989568
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:56.163074+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:57.163191+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515fc00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb22a501e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:58.163349+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb222bcf00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:10:59.163472+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:00.163593+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb257765a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.540281296s of 10.644290924s, submitted: 20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25776f00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280396 data_alloc: 234881024 data_used: 10989568
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97419264 unmapped: 16449536 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:01.163742+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515fc00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97042432 unmapped: 16826368 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:02.163894+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97042432 unmapped: 16826368 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:03.164036+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100958208 unmapped: 12910592 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:04.164617+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:05.165248+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105037824 unmapped: 8830976 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337852 data_alloc: 234881024 data_used: 19390464
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:06.165703+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105037824 unmapped: 8830976 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:07.166232+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:08.166586+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:09.166753+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:10.167004+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339364 data_alloc: 234881024 data_used: 19390464
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.622159004s of 10.657759666s, submitted: 8
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:11.167274+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:12.167440+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:13.167668+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:14.167921+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111075328 unmapped: 3850240 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:15.168104+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111214592 unmapped: 3710976 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1423561 data_alloc: 234881024 data_used: 20221952
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ead000/0x0/0x4ffc00000, data 0x2b07ea4/0x2bc7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:16.168274+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111443968 unmapped: 3481600 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:17.168429+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111443968 unmapped: 3481600 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:18.168567+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111640576 unmapped: 3284992 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:19.168685+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111640576 unmapped: 3284992 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:20.168878+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111648768 unmapped: 3276800 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1423561 data_alloc: 234881024 data_used: 20221952
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:21.169057+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111648768 unmapped: 3276800 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.157508850s of 10.318835258s, submitted: 79
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:22.169197+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:23.169412+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257774a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb22e632c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:24.169605+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb25235860
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:25.169877+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220056 data_alloc: 234881024 data_used: 10858496
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:26.170037+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa067000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:27.170189+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa067000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:28.170422+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:29.170609+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:30.170778+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb252165a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb251023c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102637568 unmapped: 12288000 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969360 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb23c64d20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:31.170964+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93929472 unmapped: 20996096 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:32.171170+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:33.171358+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:34.171555+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:35.171738+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:36.171954+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:37.172098+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:38.172276+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:39.172412+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:40.172565+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:41.172734+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:42.172901+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:43.173079+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:44.173273+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:45.173429+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:46.173567+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:47.173666+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:48.173776+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:49.173894+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:50.174021+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:51.174220+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:52.174426+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:53.174603+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:54.174796+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:55.174915+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:56.175021+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.769847870s of 34.920715332s, submitted: 64
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97558528 unmapped: 28000256 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb256aba40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:57.175155+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:58.175285+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:11:59.175363+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:00.175473+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb256aa960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1091241 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:01.175606+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:02.175707+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:03.175806+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:04.175983+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:05.176109+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb256aad20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1091241 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:06.176260+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93896704 unmapped: 31662080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:07.176394+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93896704 unmapped: 31662080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:08.176811+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:09.177772+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:10.178534+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201137 data_alloc: 234881024 data_used: 12959744
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:11.179145+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:12.179456+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:13.179597+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:14.180023+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:15.180196+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201137 data_alloc: 234881024 data_used: 12959744
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:16.180631+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99991552 unmapped: 25567232 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:17.180921+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100007936 unmapped: 25550848 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.422227859s of 21.509000778s, submitted: 20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:18.181076+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106987520 unmapped: 18571264 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a14000/0x0/0x4ffc00000, data 0x1faddef/0x2068000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:19.181450+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108470272 unmapped: 17088512 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a14000/0x0/0x4ffc00000, data 0x1faddef/0x2068000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:20.181641+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108478464 unmapped: 17080320 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319733 data_alloc: 234881024 data_used: 14139392
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:21.181986+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:22.182140+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f996e000/0x0/0x4ffc00000, data 0x2053def/0x210e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:23.182524+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f996e000/0x0/0x4ffc00000, data 0x2053def/0x210e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:24.182819+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:25.183046+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311406 data_alloc: 234881024 data_used: 14139392
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:26.183305+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:27.183566+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f994a000/0x0/0x4ffc00000, data 0x2077def/0x2132000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:28.183722+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:29.183925+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107175936 unmapped: 18382848 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.874601364s of 12.195711136s, submitted: 124
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:30.184154+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 17334272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311150 data_alloc: 234881024 data_used: 14139392
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:31.184413+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9944000/0x0/0x4ffc00000, data 0x207ddef/0x2138000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:32.184632+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:33.184917+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9944000/0x0/0x4ffc00000, data 0x207ddef/0x2138000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:34.185186+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:35.185379+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311150 data_alloc: 234881024 data_used: 14139392
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:36.185591+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a605a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:37.185942+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:38.186138+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9941000/0x0/0x4ffc00000, data 0x2080def/0x213b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:39.186292+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:40.186519+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311694 data_alloc: 234881024 data_used: 14151680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:41.186722+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:42.186912+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:43.187038+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.061926842s of 14.079800606s, submitted: 4
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:44.187230+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9930000/0x0/0x4ffc00000, data 0x2091def/0x214c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 17195008 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9930000/0x0/0x4ffc00000, data 0x2091def/0x214c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:45.187392+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb25678f00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 17186816 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 982200 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:46.187524+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb25824b40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98844672 unmapped: 26714112 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:47.187667+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:48.187850+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:49.188118+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:50.188238+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983712 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:51.188445+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:52.188613+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:53.188770+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:54.188950+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:55.189113+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:56.189282+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:57.189442+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:58.189569+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:12:59.189710+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:00.189846+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:01.189995+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 nova_compute[235775]: 2025-10-10 10:30:48.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:02.190120+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:03.190283+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:04.190492+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:05.190652+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:06.190797+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:07.190928+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:08.191066+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:09.191236+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:10.191381+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:11.191542+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:12.191683+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb24f1b4a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515fc00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb24f1ad20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24f1a3c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24f1a780
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.527706146s of 28.564867020s, submitted: 20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 24657920 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24f1a1e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:13.191893+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:14.192725+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:15.193308+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:16.193853+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065364 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb249925a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:17.194925+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b1c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24992f00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98213888 unmapped: 31023104 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24c6be00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:18.195393+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24c6a1e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faea5000/0x0/0x4ffc00000, data 0xb1cdef/0xbd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97878016 unmapped: 31358976 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:19.196175+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97878016 unmapped: 31358976 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:20.196761+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97427456 unmapped: 31809536 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:21.196962+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1092582 data_alloc: 218103808 data_used: 3399680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99352576 unmapped: 29884416 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:22.197524+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99434496 unmapped: 29802496 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:23.197901+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99467264 unmapped: 29769728 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:24.198586+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:25.198801+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:26.198988+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1131798 data_alloc: 234881024 data_used: 9220096
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:27.199368+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:28.199609+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:29.199892+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:30.200198+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:31.200481+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1133926 data_alloc: 234881024 data_used: 9277440
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.591819763s of 18.735073090s, submitted: 24
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107536384 unmapped: 21700608 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:32.200730+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:33.200901+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:34.201079+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:35.201267+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:36.201421+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1232606 data_alloc: 234881024 data_used: 9793536
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107724800 unmapped: 21512192 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:37.201579+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106569728 unmapped: 22667264 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:38.201894+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x16cfdff/0x178b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25754800 session 0x55cb25081680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:39.202120+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:40.202293+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:41.202455+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230270 data_alloc: 234881024 data_used: 9854976
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:42.202922+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x16cfdff/0x178b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:43.203089+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.891391754s of 12.132454872s, submitted: 125
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:44.203325+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed7000/0x0/0x4ffc00000, data 0x16d9dff/0x1795000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:45.203421+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:46.203661+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230430 data_alloc: 234881024 data_used: 9854976
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:47.203951+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:48.204098+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515e400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515e400 session 0x55cb24f1ba40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515ec00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515ec00 session 0x55cb24a1bc20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:49.204219+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106627072 unmapped: 22609920 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb222bcf00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:50.204355+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f97ba000/0x0/0x4ffc00000, data 0x1df6dff/0x1eb2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107798528 unmapped: 21438464 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:51.204511+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1296232 data_alloc: 234881024 data_used: 9854976
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9630000/0x0/0x4ffc00000, data 0x1f80dff/0x203c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:52.204663+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:53.204871+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:54.205060+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:55.205145+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.719927788s of 11.824364662s, submitted: 25
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24a1e960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962d000/0x0/0x4ffc00000, data 0x1f83dff/0x203f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515e400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107929600 unmapped: 21307392 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:56.205268+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25754800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300762 data_alloc: 234881024 data_used: 9854976
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107921408 unmapped: 21315584 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:57.205388+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:58.205541+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:13:59.205673+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:00.205785+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:01.205913+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360346 data_alloc: 234881024 data_used: 18640896
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f83e22/0x2040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114458624 unmapped: 14778368 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:02.206046+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f83e22/0x2040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 14745600 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:03.206769+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:04.206930+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb2397f2c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:05.207045+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:06.207190+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1361578 data_alloc: 234881024 data_used: 18644992
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9629000/0x0/0x4ffc00000, data 0x1f84e22/0x2041000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.852634430s of 11.898006439s, submitted: 19
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:07.207308+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122814464 unmapped: 6422528 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:08.207465+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:09.207547+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:10.207735+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:11.207880+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1493940 data_alloc: 234881024 data_used: 20471808
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 7397376 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:12.208011+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 7397376 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:13.208154+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 7389184 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:14.208330+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb237c5a40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515e400 session 0x55cb252350e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 7389184 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:15.208438+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23c1e5a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:16.208552+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245624 data_alloc: 234881024 data_used: 9854976
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:17.210931+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:18.211540+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:19.212303+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:20.212812+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:21.213539+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245624 data_alloc: 234881024 data_used: 9854976
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23c79400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:22.214244+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:23.215319+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24c6ad20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb257163c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.070486069s of 16.434377670s, submitted: 164
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb237c5c20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:24.215493+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:25.215654+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:26.215930+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:27.216189+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:28.216347+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:29.216640+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:30.216899+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:31.217434+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:32.217643+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:33.217820+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:34.218103+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:35.218341+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:36.218551+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:37.218773+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:38.218961+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:39.219345+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:40.219563+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:41.219962+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:42.220137+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:43.220284+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:44.220466+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:45.220643+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:46.220795+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:47.221040+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:48.221195+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:49.221406+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:50.221570+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.723497391s of 26.753862381s, submitted: 18
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb239192c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb23c1fe00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb252a21e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb250814a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257c8d20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:51.221796+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1118589 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:52.221924+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:53.222094+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:54.222249+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa704000/0x0/0x4ffc00000, data 0xeace51/0xf68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:55.222422+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb2397e1e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb2397e3c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:56.222573+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106676224 unmapped: 34701312 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa704000/0x0/0x4ffc00000, data 0xeace51/0xf68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1118589 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25678f00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257774a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:57.222720+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 35241984 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:58.222875+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 35241984 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:14:59.223040+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105660416 unmapped: 35717120 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:00.223261+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:01.223396+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220197 data_alloc: 234881024 data_used: 14155776
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:02.223616+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:03.223781+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:04.224038+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:05.224229+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:06.224416+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220197 data_alloc: 234881024 data_used: 14155776
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:07.224606+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:08.224801+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:09.225035+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.917535782s of 19.063180923s, submitted: 58
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:10.225210+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:11.225342+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ab4000/0x0/0x4ffc00000, data 0x1afae84/0x1bb8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1331335 data_alloc: 234881024 data_used: 14376960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:12.225493+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:13.225639+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:14.225847+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:15.225975+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9aa6000/0x0/0x4ffc00000, data 0x1b08e84/0x1bc6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:16.226105+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9aa6000/0x0/0x4ffc00000, data 0x1b08e84/0x1bc6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1331335 data_alloc: 234881024 data_used: 14376960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:17.226228+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a82000/0x0/0x4ffc00000, data 0x1b2ce84/0x1bea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:18.226383+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:19.226586+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 8276 writes, 33K keys, 8276 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
                                           Cumulative WAL: 8276 writes, 2019 syncs, 4.10 writes per sync, written: 0.03 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2282 writes, 8748 keys, 2282 commit groups, 1.0 writes per commit group, ingest: 10.36 MB, 0.02 MB/s
                                           Interval WAL: 2282 writes, 922 syncs, 2.48 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:20.226704+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:21.226850+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a82000/0x0/0x4ffc00000, data 0x1b2ce84/0x1bea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326463 data_alloc: 234881024 data_used: 14381056
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:22.227016+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.047369957s of 13.274172783s, submitted: 111
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:23.227133+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a7c000/0x0/0x4ffc00000, data 0x1b32e84/0x1bf0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:24.227297+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a7c000/0x0/0x4ffc00000, data 0x1b32e84/0x1bf0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:25.227499+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:26.227737+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326375 data_alloc: 234881024 data_used: 14381056
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:27.227935+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:28.228087+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:29.228261+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:30.228397+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a6c000/0x0/0x4ffc00000, data 0x1b42e84/0x1c00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:31.228587+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1327543 data_alloc: 234881024 data_used: 14389248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:32.228731+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:33.228959+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a6c000/0x0/0x4ffc00000, data 0x1b42e84/0x1c00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:34.229146+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:35.229334+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a1f860
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d02800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d02800 session 0x55cb250803c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25716b40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24f1ab40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.387310028s of 13.403597832s, submitted: 5
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:36.229496+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24f1b680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb24c6b680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23944400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb24c6a960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25217860
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a61c20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1397910 data_alloc: 234881024 data_used: 14389248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:37.229743+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f91b1000/0x0/0x4ffc00000, data 0x23fbef6/0x24bb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f91b1000/0x0/0x4ffc00000, data 0x23fbef6/0x24bb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:38.229937+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:39.230338+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:40.230518+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:41.230704+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23944400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb257c9680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118538240 unmapped: 22839296 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1402283 data_alloc: 234881024 data_used: 14389248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb25755400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:42.230917+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f918b000/0x0/0x4ffc00000, data 0x2420ef6/0x24e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 22822912 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:43.231040+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121602048 unmapped: 19775488 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:44.231176+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126173184 unmapped: 15204352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:45.231303+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126173184 unmapped: 15204352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f918b000/0x0/0x4ffc00000, data 0x2420ef6/0x24e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:46.231424+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1463991 data_alloc: 234881024 data_used: 23457792
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:47.231543+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:48.231688+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9188000/0x0/0x4ffc00000, data 0x2424ef6/0x24e4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:49.231919+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:50.232039+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:51.232226+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1463991 data_alloc: 234881024 data_used: 23457792
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.817728996s of 15.977606773s, submitted: 48
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:52.232470+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 15040512 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:53.232620+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130342912 unmapped: 11034624 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:54.232776+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131219456 unmapped: 10158080 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8654000/0x0/0x4ffc00000, data 0x2f58ef6/0x3018000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:55.232941+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131219456 unmapped: 10158080 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863e000/0x0/0x4ffc00000, data 0x2f6eef6/0x302e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:56.233080+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131252224 unmapped: 10125312 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1564155 data_alloc: 234881024 data_used: 24436736
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:57.233203+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131293184 unmapped: 10084352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:58.233283+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:15:59.233425+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:00.233535+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863b000/0x0/0x4ffc00000, data 0x2f71ef6/0x3031000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:01.233734+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560275 data_alloc: 234881024 data_used: 24440832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863b000/0x0/0x4ffc00000, data 0x2f71ef6/0x3031000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:02.233846+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:03.233975+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:04.234107+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.395318985s of 12.603665352s, submitted: 111
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:05.234251+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:06.234338+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8635000/0x0/0x4ffc00000, data 0x2f77ef6/0x3037000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560555 data_alloc: 234881024 data_used: 24440832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:07.234459+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8635000/0x0/0x4ffc00000, data 0x2f77ef6/0x3037000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:08.234608+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:09.234741+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:10.234807+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:11.234968+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560411 data_alloc: 234881024 data_used: 24440832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:12.235078+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:13.235184+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:14.235350+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:15.235503+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131072000 unmapped: 10305536 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:16.235637+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131072000 unmapped: 10305536 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.019625664s of 12.032996178s, submitted: 5
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561307 data_alloc: 234881024 data_used: 24440832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:17.235868+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:18.236014+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:19.236193+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:20.236345+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:21.236469+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8629000/0x0/0x4ffc00000, data 0x2f80ef6/0x3040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560963 data_alloc: 234881024 data_used: 24440832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:22.236620+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:23.236774+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:24.237006+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:25.237156+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:26.237296+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561267 data_alloc: 234881024 data_used: 24440832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:27.237490+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8627000/0x0/0x4ffc00000, data 0x2f84ef6/0x3044000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.661789894s of 10.690342903s, submitted: 10
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131137536 unmapped: 10240000 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:28.237624+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131170304 unmapped: 10207232 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:29.237746+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:30.237857+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb229ed400 session 0x55cb2397fe00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d66c00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:31.237990+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561187 data_alloc: 234881024 data_used: 24440832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:32.238136+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130637824 unmapped: 10739712 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:33.238268+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130637824 unmapped: 10739712 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:34.238468+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130646016 unmapped: 10731520 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861f000/0x0/0x4ffc00000, data 0x2f8def6/0x304d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:35.238575+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130727936 unmapped: 10649600 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:36.238708+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130924544 unmapped: 10452992 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560899 data_alloc: 234881024 data_used: 24440832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:37.238859+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130924544 unmapped: 10452992 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:38.239043+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 10444800 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.915495872s of 11.539477348s, submitted: 233
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:39.239184+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130940928 unmapped: 10436608 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861c000/0x0/0x4ffc00000, data 0x2f90ef6/0x3050000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:40.239364+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130940928 unmapped: 10436608 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:41.239539+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24b1d680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb256ab2c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861c000/0x0/0x4ffc00000, data 0x2f90ef6/0x3050000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23dda3c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1344411 data_alloc: 234881024 data_used: 14389248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93d1000/0x0/0x4ffc00000, data 0x1b92e84/0x1c50000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:42.239783+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:43.239955+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:44.240139+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93f2000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:45.240311+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:46.240425+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1345059 data_alloc: 234881024 data_used: 14389248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:47.240583+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93f2000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:48.240762+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24a1ed20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb25102b40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a3b000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.092136383s of 10.210209846s, submitted: 50
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:49.240949+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25102780
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:50.241216+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:51.241523+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:52.241814+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:53.242187+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:54.242477+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:55.242647+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:56.242869+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:57.243051+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:58.243213+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:16:59.243382+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:00.243583+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:01.243719+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:02.243903+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:03.244107+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:04.244299+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:05.244435+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:06.244607+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:07.244747+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:08.244913+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:09.245075+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:10.245262+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:11.245441+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:12.245588+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:13.245722+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:14.245923+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:15.246067+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23944400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb24f36f00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24f37860
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24f363c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:16.246218+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24f374a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.982812881s of 27.189233780s, submitted: 65
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22e62960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257770e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaac000/0x0/0x4ffc00000, data 0xb04e51/0xbc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129727 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:17.246371+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaac000/0x0/0x4ffc00000, data 0xb04e51/0xbc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:18.246505+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257761e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:19.246703+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257774a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25776d20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257763c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:20.246902+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111886336 unmapped: 33693696 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:21.247096+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111886336 unmapped: 33693696 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:22.247266+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166738 data_alloc: 218103808 data_used: 5058560
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:23.247471+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaab000/0x0/0x4ffc00000, data 0xb04e61/0xbc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:24.247678+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:25.293996+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113639424 unmapped: 31940608 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:26.294185+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22e62d20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.018430710s of 10.130161285s, submitted: 41
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb25080780
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaab000/0x0/0x4ffc00000, data 0xb04e61/0xbc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 35414016 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25717a40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:27.294315+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:28.294477+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:29.294589+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:30.294883+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:31.296916+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:32.297145+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:33.298096+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:34.298465+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:35.299441+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:36.300264+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:37.300472+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:38.300755+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25716f00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24b1dc20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb23d67400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24b1c960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:39.300884+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23c1e5a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.854639053s of 12.951243401s, submitted: 33
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb23c1f4a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb249781e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb24a61860
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2515f400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257c9c20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257c85a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:40.301464+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:41.302021+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:42.302169+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1126674 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:43.302373+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb257c8000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:44.302511+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24a601e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab9e000/0x0/0x4ffc00000, data 0xa13def/0xace000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:45.302740+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb24a605a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb250e1800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb257770e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb250e1800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110034944 unmapped: 39747584 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:46.302882+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110034944 unmapped: 39747584 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: mgrc ms_handle_reset ms_handle_reset con 0x55cb22623000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/194506248
Oct 10 10:30:48 compute-2 ceph-osd[77423]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/194506248,v1:192.168.122.100:6801/194506248]
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: get_auth_request con 0x55cb2515f400 auth_method 0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: mgrc handle_mgr_configure stats_period=5
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:47.303007+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1128267 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 39682048 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:48.303230+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113369088 unmapped: 36413440 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab9e000/0x0/0x4ffc00000, data 0xa13def/0xace000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:49.303364+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:50.303523+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:51.303680+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:52.303866+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192259 data_alloc: 234881024 data_used: 9543680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.485980988s of 13.640996933s, submitted: 24
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb25776d20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257772c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:53.304896+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb252350e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:54.305114+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:55.305273+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:56.305490+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:57.305661+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:58.305803+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:17:59.306008+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:00.306173+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:01.306416+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:02.306570+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:03.306758+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:04.307026+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:05.307167+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:06.307326+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:07.307480+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:08.307730+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:09.307940+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:10.308134+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:11.308368+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:12.308554+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:13.308687+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:14.308913+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:15.309071+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22a51c20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22a503c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22a50000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:16.309250+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb222bcf00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.271076202s of 23.354894638s, submitted: 36
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25080780
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb250e1800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb24b1d680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24979a40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24f37e00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb2397f2c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:17.309400+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156560 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa6e000/0x0/0x4ffc00000, data 0xb42dff/0xbfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:18.309545+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:19.309751+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:20.309902+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:21.310053+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:22.310216+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156560 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa6e000/0x0/0x4ffc00000, data 0xb42dff/0xbfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:23.310387+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:24.310595+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:25.310722+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb23d3e960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111591424 unmapped: 38191104 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb250e6000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:26.310918+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111591424 unmapped: 38191104 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:27.311125+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182488 data_alloc: 218103808 data_used: 3469312
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 37814272 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:28.311331+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:29.311589+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:30.311776+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:31.311905+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:32.312046+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221704 data_alloc: 234881024 data_used: 9289728
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:33.312217+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:34.312367+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:35.312491+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:36.312598+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:37.312763+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.874525070s of 20.991596222s, submitted: 39
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303724 data_alloc: 234881024 data_used: 9342976
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 26583040 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:38.312904+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f0dff/0x19ac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:39.313089+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:40.313255+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:41.313491+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:42.313635+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1373710 data_alloc: 234881024 data_used: 10682368
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:43.313787+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:44.313948+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f972d000/0x0/0x4ffc00000, data 0x1e83dff/0x1f3f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:45.314115+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:46.314375+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:47.314526+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375014 data_alloc: 234881024 data_used: 10694656
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:48.314665+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:49.314950+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f970f000/0x0/0x4ffc00000, data 0x1ea1dff/0x1f5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:50.315102+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:51.315307+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.182755470s of 14.495874405s, submitted: 173
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123781120 unmapped: 26001408 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:52.315467+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb23d3f0e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e6000 session 0x55cb257163c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1374934 data_alloc: 234881024 data_used: 10694656
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb250e6000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e6000 session 0x55cb2397e960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:53.315671+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:54.315819+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:55.316046+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:56.316184+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:57.316439+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:58.316731+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:18:59.316938+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:00.317073+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:01.317236+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:02.317487+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:03.317655+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:04.317916+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:05.318034+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:06.318205+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:07.318359+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:08.318506+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:09.318609+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:10.318774+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:11.318939+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:12.319083+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:13.319251+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:14.319409+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:15.319508+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:16.319621+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:17.319778+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb237c5c20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb2397e000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb23c1f0e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb257772c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb252b0000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.957300186s of 26.056312561s, submitted: 38
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22e63680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a95860
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22622400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a1ed20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22b01000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb22a68d20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb239183c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:18.319897+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:19.320023+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:20.320180+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:21.320304+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:22.320463+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24a1a1e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217690 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22b01000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:23.320608+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115064832 unmapped: 46784512 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:24.320710+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 43900928 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:25.320839+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:26.320960+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:27.321110+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1334994 data_alloc: 234881024 data_used: 17313792
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:28.321266+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:29.321499+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:30.321727+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 36855808 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:31.321902+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 36855808 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:32.322026+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 36823040 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1334994 data_alloc: 234881024 data_used: 17313792
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:33.322165+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 36814848 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.868372917s of 16.010391235s, submitted: 40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:34.322308+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130957312 unmapped: 30892032 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93b2000/0x0/0x4ffc00000, data 0x1deddff/0x1ea9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:35.322456+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131473408 unmapped: 30375936 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb258243c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25825e00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb258252c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816fc00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816fc00 session 0x55cb258250e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:36.322636+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25824f00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131596288 unmapped: 30253056 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:37.322855+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1516856 data_alloc: 234881024 data_used: 18149376
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:38.323023+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb24f374a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:39.323167+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8737000/0x0/0x4ffc00000, data 0x2a68dff/0x2b24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb24f361e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:40.324313+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:41.325002+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb24f37680
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816f000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:42.325105+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816f000 session 0x55cb24f37a40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131702784 unmapped: 30146560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1519196 data_alloc: 234881024 data_used: 18149376
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:43.325649+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131702784 unmapped: 30146560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8712000/0x0/0x4ffc00000, data 0x2a8ce32/0x2b4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:44.325874+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 133881856 unmapped: 27967488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:45.326053+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140607488 unmapped: 21241856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:46.326243+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.067866325s of 12.396329880s, submitted: 138
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f870f000/0x0/0x4ffc00000, data 0x2a8fe32/0x2b4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:47.326688+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1595808 data_alloc: 251658240 data_used: 29470720
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:48.327117+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:49.327482+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f870f000/0x0/0x4ffc00000, data 0x2a8fe32/0x2b4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:50.327639+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:51.327819+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:52.328027+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1595808 data_alloc: 251658240 data_used: 29470720
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:53.328180+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:54.328605+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140648448 unmapped: 21200896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:55.328888+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143187968 unmapped: 18661376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7d06000/0x0/0x4ffc00000, data 0x3498e32/0x3556000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:56.329149+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.850776672s of 10.000297546s, submitted: 56
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143392768 unmapped: 18456576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:57.329267+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1680754 data_alloc: 251658240 data_used: 29532160
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:58.329400+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:19:59.329664+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:00.329854+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:01.330075+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd8000/0x0/0x4ffc00000, data 0x34c5e32/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:02.330264+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd8000/0x0/0x4ffc00000, data 0x34c5e32/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143597568 unmapped: 18251776 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679274 data_alloc: 251658240 data_used: 29532160
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:03.330473+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:04.330663+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:05.330867+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:06.331007+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:07.331194+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679746 data_alloc: 251658240 data_used: 29532160
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:08.331399+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:09.331552+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:10.331689+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:11.331881+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:12.332010+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679746 data_alloc: 251658240 data_used: 29532160
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:13.332157+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.070764542s of 17.136646271s, submitted: 20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143728640 unmapped: 18120704 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:14.332433+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143728640 unmapped: 18120704 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb257c94a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25678960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:15.332608+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136912896 unmapped: 24936448 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb22a51c20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:16.332743+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:17.332855+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f92e4000/0x0/0x4ffc00000, data 0x1ebbdff/0x1f77000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1445716 data_alloc: 234881024 data_used: 18149376
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:18.332981+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:19.333071+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f92e4000/0x0/0x4ffc00000, data 0x1ebbdff/0x1f77000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb252a25a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a1be00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:20.333191+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136953856 unmapped: 24895488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24992960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:21.333365+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:22.333585+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:23.333732+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:24.333888+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:25.334097+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:26.334247+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:27.334382+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:28.334492+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:29.334640+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:30.334770+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:31.334889+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:32.335063+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:33.335215+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:34.335379+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:35.335554+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:36.335747+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:37.335891+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:38.336078+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:39.336258+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:40.336457+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:41.336626+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:42.336787+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:43.336916+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:44.337692+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:45.337970+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:46.338100+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22b01000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb24b1c960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24c6a1e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123912192 unmapped: 37937152 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb24c6ba40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e400
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb24a941e0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.266139984s of 33.476127625s, submitted: 90
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a95c20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22b01000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb257c8d20
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22a51e00
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:47.338273+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25717a40
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb25717860
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162264 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:48.339354+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:49.340241+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:50.341881+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:51.342009+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:52.343596+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162264 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:53.344229+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb222fb800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a60000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:54.344513+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb22b01000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:55.344861+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:56.345321+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:57.345464+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197892 data_alloc: 218103808 data_used: 5349376
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:58.346096+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:20:59.346269+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:00.346590+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:01.346739+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:02.347198+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197892 data_alloc: 218103808 data_used: 5349376
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:03.347411+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:04.347711+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:05.347858+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.718759537s of 18.834480286s, submitted: 44
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125886464 unmapped: 35962880 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:06.348080+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129695744 unmapped: 32153600 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:07.348189+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266056 data_alloc: 218103808 data_used: 6639616
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:08.348472+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:09.348621+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:10.348866+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9254000/0x0/0x4ffc00000, data 0xdabe61/0xe68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:11.349095+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:12.349312+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266056 data_alloc: 218103808 data_used: 6639616
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:13.349488+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:14.349805+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:15.350049+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9254000/0x0/0x4ffc00000, data 0xdabe61/0xe68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:16.350254+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.704682350s of 10.921176910s, submitted: 89
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb24a612c0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 32022528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb251e9800
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:17.350386+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25678960
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:18.350515+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:19.350643+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:20.350899+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:21.351032+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:22.351152+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:23.351359+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:24.351506+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:25.351634+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:26.351818+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:27.352015+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:28.352158+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:29.352330+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:30.352534+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:31.352738+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:32.352854+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:33.352965+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:34.353085+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:35.353199+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:36.353356+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:37.353507+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:38.353617+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:39.353789+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:40.353953+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:41.354087+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:42.354290+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:43.354459+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:44.354618+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:45.354759+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:46.354921+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:47.355062+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:48.355309+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:49.355427+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:50.355561+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:51.355727+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:52.355895+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:53.356008+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:54.356305+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:55.356453+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:56.356574+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:57.356711+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:58.356861+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:21:59.356993+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:00.357119+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:01.357278+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:02.357411+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:03.357578+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:04.357794+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:05.357951+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:06.358149+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:07.358273+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:08.358456+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:09.358597+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:10.358780+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:11.358913+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:12.359056+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:13.359172+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:14.359315+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:15.359442+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:16.359634+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:17.359876+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:18.360177+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:19.360341+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:20.360462+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:21.360607+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:22.360753+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:23.360869+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:24.361014+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:25.361145+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:26.361270+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:27.361354+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125870080 unmapped: 35979264 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'config show' '{prefix=config show}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:28.361521+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125419520 unmapped: 36429824 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:29.361977+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125558784 unmapped: 36290560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:30.362096+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'log dump' '{prefix=log dump}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125566976 unmapped: 36282368 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'perf dump' '{prefix=perf dump}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:31.362210+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'perf schema' '{prefix=perf schema}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:32.362366+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:33.362488+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:34.362645+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:35.362813+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:36.362911+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:37.363027+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:38.363157+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:39.363289+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:40.363417+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:41.363554+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:42.363706+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:43.363870+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:44.364044+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:45.364158+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:46.364276+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:47.364342+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:48.364411+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:49.364544+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:50.364678+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:51.364805+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:52.364893+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:53.365013+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:54.365185+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:55.365345+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:56.365484+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:57.365614+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:58.365735+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:22:59.365893+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:00.366003+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:01.366119+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:02.366233+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:03.366342+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:04.366710+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:05.366921+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:06.367057+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:07.367264+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:08.367435+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:09.367589+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:10.367715+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:11.368455+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:12.368574+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:13.368703+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:14.368898+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:15.369042+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:16.369162+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:17.369280+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:18.369430+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:19.369577+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:20.369705+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:21.370004+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:22.370135+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:23.370262+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:24.370419+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:25.370534+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:26.370688+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:27.370865+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:28.371078+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:29.371238+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:30.371405+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:31.371521+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:32.371681+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:33.371841+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:34.372029+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:35.372207+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:36.372374+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:37.372507+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:38.372690+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:39.372962+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:40.373117+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:41.373256+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:42.373410+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:43.373595+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:44.373775+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:45.373963+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:46.374110+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:47.374242+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:48.374403+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:49.374536+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:50.374681+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:51.374912+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:52.375052+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:53.375173+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:54.375333+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:55.375499+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:56.375620+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:57.375772+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:58.375970+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:23:59.376896+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:00.377083+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:01.377820+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:02.378089+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:03.378326+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:04.378565+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:05.379007+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:06.379294+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:07.379678+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:08.379959+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:09.380353+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:10.380524+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:11.380787+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:12.380947+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:13.381207+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:14.381381+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:15.381638+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:16.381786+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:17.381992+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:18.382198+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:19.382411+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:20.382597+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:21.382750+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:22.382881+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:23.383029+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:24.383244+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:25.383431+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:26.383578+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:27.383727+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:28.383881+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:29.384158+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:30.384354+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:31.384481+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:32.384648+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:33.384801+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:34.385063+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:35.385293+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:36.385461+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:37.385667+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:38.385902+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:39.386033+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:40.386156+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:41.386299+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:42.386607+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:43.386765+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:44.386916+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:45.387116+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125837312 unmapped: 36012032 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:46.387249+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:47.387457+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:48.387617+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:49.387800+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:50.388002+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:51.388175+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:52.388334+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:53.388572+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:54.388800+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:55.388956+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:56.389106+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:57.389279+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:58.389422+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:24:59.389606+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:00.389756+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:01.389926+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:02.390034+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:03.390216+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:04.390471+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:05.390623+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:06.390814+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:07.391035+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:08.391170+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:09.391302+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 36659200 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:10.391462+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 36659200 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:11.391630+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 36659200 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:12.391723+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 36659200 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:13.392004+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 36659200 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:14.392248+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 36651008 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:15.392382+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 36651008 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:16.392566+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 36651008 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:17.392687+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 36634624 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:18.392817+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 36634624 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:19.393011+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Cumulative writes: 10K writes, 42K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
                                           Cumulative WAL: 10K writes, 3043 syncs, 3.52 writes per sync, written: 0.04 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2448 writes, 8982 keys, 2448 commit groups, 1.0 writes per commit group, ingest: 9.37 MB, 0.02 MB/s
                                           Interval WAL: 2448 writes, 1024 syncs, 2.39 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:20.393182+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:21.393320+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:22.393440+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:23.393580+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:24.393787+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:25.393949+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:26.394107+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:27.394287+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:28.394428+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:29.394582+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:30.394730+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:31.394872+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:32.395045+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:33.395168+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:34.395357+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:35.395510+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:36.395699+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _send_mon_message to mon.compute-2 at v2:192.168.122.102:3300/0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:37.395892+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:38.396096+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:39.396240+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:40.396422+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:41.396564+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:42.396748+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 36610048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:43.396913+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 36610048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:44.397081+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 36610048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:45.397262+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:46.397438+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:47.397591+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:48.397775+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:49.397924+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:50.398040+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:51.398303+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:52.398486+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:53.398721+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:54.398958+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:55.399135+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:56.399257+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:57.399422+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:58.399605+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:25:59.399734+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:00.399880+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:01.400032+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:02.400168+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:03.400306+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:04.400512+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:05.400659+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 36585472 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:06.400798+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:07.401009+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:08.401208+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:09.401481+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:10.401645+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:11.401787+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:12.401958+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:13.402093+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:14.402277+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:15.402420+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:16.402560+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:17.402672+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:18.402872+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:19.403040+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:20.403180+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:21.403361+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:22.403531+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:23.403651+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:24.403810+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:25.404030+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:26.404154+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:27.404339+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:28.404513+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:29.404639+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 36560896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:30.404798+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 36560896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:31.405008+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 36560896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:32.405169+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 36560896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:33.405444+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 36560896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 317.378509521s of 317.484130859s, submitted: 36
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:34.405697+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 36552704 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:35.405809+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [0,1])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125493248 unmapped: 36356096 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:36.405964+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125616128 unmapped: 36233216 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:37.406253+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125616128 unmapped: 36233216 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:38.406404+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:39.406573+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:40.406777+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:41.406956+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:42.407086+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:43.407272+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:44.407457+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:45.407589+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:46.407789+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:47.407970+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:48.408170+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:49.408358+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:50.408581+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:51.408797+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:52.409062+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:53.409222+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125632512 unmapped: 36216832 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:54.409459+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125632512 unmapped: 36216832 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:55.409788+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125632512 unmapped: 36216832 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:56.410036+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125640704 unmapped: 36208640 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:57.410189+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125640704 unmapped: 36208640 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:58.410377+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125640704 unmapped: 36208640 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:26:59.410590+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125640704 unmapped: 36208640 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:00.410728+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125640704 unmapped: 36208640 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:01.410951+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125648896 unmapped: 36200448 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:02.411105+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125648896 unmapped: 36200448 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:03.411270+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:04.411459+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:05.411590+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:06.411774+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:07.411957+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:08.412144+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:09.412330+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:10.412477+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:11.412622+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:12.412759+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 36184064 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:13.412922+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 36184064 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:14.413080+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 36184064 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:15.413266+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 36184064 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:16.413452+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:17.413612+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:18.413771+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:19.413937+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:20.414058+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:21.414199+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:22.414363+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:23.414537+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:24.414739+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:25.414882+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:26.415009+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:27.415177+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:28.415339+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:29.415463+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:30.415627+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:31.415796+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:32.415931+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:33.416104+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:34.416332+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:35.416495+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:36.416611+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:37.416795+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:38.416984+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:39.417260+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:40.417409+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:41.417552+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:42.417701+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:43.418056+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:44.418355+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:45.418702+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:46.418899+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:47.419220+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:48.419500+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:49.419813+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:50.420143+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:51.420372+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:52.420555+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:53.420692+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:54.421015+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:55.421220+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:56.421369+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:57.421659+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:58.421953+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:27:59.422277+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:00.422487+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:01.422711+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:02.422933+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:03.423141+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:04.423341+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:05.423503+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:06.423645+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:07.423905+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:08.424108+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:09.424295+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:10.424484+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:11.424684+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:12.424939+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:13.425149+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:14.425405+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:15.426107+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:16.426922+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:17.427779+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:18.427944+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:19.429138+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:20.429731+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:21.429907+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:22.430437+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:23.431160+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:24.431668+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:25.431975+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:26.432109+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:27.432707+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:28.433057+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:29.433217+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:30.433465+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:31.433653+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:32.433786+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:33.433920+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:34.434126+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:35.434308+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:36.434443+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:37.434731+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:38.434862+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:39.435034+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:40.435261+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:41.435491+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:42.435649+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:43.435789+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:44.436041+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:45.436226+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:46.436426+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:47.436576+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:48.436725+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:49.436904+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:50.437046+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:51.437188+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:52.437369+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:53.437487+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:54.437660+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:55.437905+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:56.438158+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:57.438379+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:58.438518+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:28:59.438697+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:00.438894+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:01.439040+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:02.439127+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:03.439262+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:04.439443+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:05.439609+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:06.439792+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:07.439956+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:08.440070+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:09.440229+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:10.440400+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:11.440534+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:12.440672+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:13.440812+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:14.441066+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:15.441243+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:16.441442+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:17.441596+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:18.441754+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:19.441898+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:20.442410+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:21.443439+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24c585a0
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: handle_auth_request added challenge on 0x55cb2816e000
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:22.443931+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:23.444191+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:24.444806+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:25.445168+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:26.445390+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:27.445704+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:28.445957+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:29.446219+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:30.446484+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:31.446609+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:32.446747+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:33.446879+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:34.447106+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:35.447246+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:36.447388+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:37.447550+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:38.447669+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:39.447783+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:40.447890+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [3])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:41.448082+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:42.448352+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:43.448600+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:44.448819+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:45.449060+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:46.449277+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:47.449500+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:48.449681+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:49.449907+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:50.450124+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:51.450313+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:52.450540+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:53.450789+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:54.451087+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:55.451226+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:56.451420+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:57.451675+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:58.451857+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:29:59.452043+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:00.452284+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:01.452525+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:02.452771+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:03.453023+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:04.453232+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:05.453370+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:06.453573+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:07.453783+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:08.453953+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:09.454127+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:10.454294+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:11.454503+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:12.454646+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:13.454784+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:14.455010+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 10:30:48 compute-2 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 10:30:48 compute-2 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:15.455199+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'config show' '{prefix=config show}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 36651008 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}'
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:16.455317+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 36634624 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: tick
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_tickets
Oct 10 10:30:48 compute-2 ceph-osd[77423]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-10-10T10:30:17.455444+0000)
Oct 10 10:30:48 compute-2 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 36716544 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 10:30:48 compute-2 ceph-osd[77423]: do_command 'log dump' '{prefix=log dump}'
Oct 10 10:30:48 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Oct 10 10:30:48 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2013373529' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.27967 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: pgmap v1349: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1085433634' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/149990470' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.18420 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.27979 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1680992890' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1585167422' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2303646199' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2469062832' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.18441 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3231707591' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:30:48 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2013373529' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:30:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:48 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:48 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:49 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:49 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:49 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:49.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Oct 10 10:30:49 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1092835265' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Oct 10 10:30:49 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2000313604' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Oct 10 10:30:49 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3657722192' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct 10 10:30:49 compute-2 crontab[261615]: (root) LIST (root)
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.27179 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.27994 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.18468 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3444631537' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.27191 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.28015 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3721446131' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.18486 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1092835265' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/724274365' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.27209 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.28033 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2000313604' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3657722192' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct 10 10:30:49 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3730549091' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 10:30:49 compute-2 sshd-session[260008]: Failed password for root from 80.94.93.119 port 50722 ssh2
Oct 10 10:30:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:49 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:49 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:50 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:50 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:50 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:50.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:50 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Oct 10 10:30:50 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2779081294' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 10:30:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:50 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:50 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.18501 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: pgmap v1350: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.27227 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.28054 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.18519 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/598726768' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.27239 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.28075 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3542325360' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.18525 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3712051660' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2779081294' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/838423744' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Oct 10 10:30:51 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3755513474' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 10:30:51 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:51 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:51 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:51.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:51 compute-2 nova_compute[235775]: 2025-10-10 10:30:51.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:51 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Oct 10 10:30:51 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2752456730' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Oct 10 10:30:51 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/251813461' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct 10 10:30:51 compute-2 sshd-session[260008]: Received disconnect from 80.94.93.119 port 50722:11:  [preauth]
Oct 10 10:30:51 compute-2 sshd-session[260008]: Disconnected from authenticating user root 80.94.93.119 port 50722 [preauth]
Oct 10 10:30:51 compute-2 sshd-session[260008]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 10 10:30:51 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Oct 10 10:30:51 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/821557892' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 10:30:51 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Oct 10 10:30:51 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1210927648' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 10:30:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:51 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:51 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:52 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:52 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:52 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:52.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.27245 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.28099 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.18540 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.28117 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3755513474' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.27266 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.18564 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1286859206' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2752456730' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.28132 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.27281 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4144306905' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.18582 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/251813461' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/821557892' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1210927648' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Oct 10 10:30:52 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3064887336' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Oct 10 10:30:52 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2197365172' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Oct 10 10:30:52 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2319149536' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Oct 10 10:30:52 compute-2 unix_chkpwd[262021]: password check failed for user (root)
Oct 10 10:30:52 compute-2 sshd-session[261896]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Oct 10 10:30:52 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Oct 10 10:30:52 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1413997151' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Oct 10 10:30:52 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2285275171' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct 10 10:30:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:52 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:52 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:53 compute-2 ceph-mon[74913]: pgmap v1351: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.27296 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.18603 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2364379109' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3064887336' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.27308 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2197365172' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1586037795' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/865574371' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2672443346' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2319149536' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2221259386' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1413997151' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2285275171' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2579466127' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Oct 10 10:30:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1814715029' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Oct 10 10:30:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3898864091' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Oct 10 10:30:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2844202342' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 10:30:53 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:53 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:53 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:53.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:53 compute-2 nova_compute[235775]: 2025-10-10 10:30:53.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:53 compute-2 systemd[1]: Starting Hostname Service...
Oct 10 10:30:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Oct 10 10:30:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/112314830' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct 10 10:30:53 compute-2 systemd[1]: Started Hostname Service.
Oct 10 10:30:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Oct 10 10:30:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4280476709' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Oct 10 10:30:53 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/853849376' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct 10 10:30:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:53 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:53 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:54 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:54 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:54 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:54.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.27326 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1814715029' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1351759873' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/4094752344' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3898864091' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2844202342' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1182933054' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/112314830' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/4229550321' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1828719708' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/847432332' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4280476709' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1369088152' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/853849376' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/29899441' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1052434041' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/827572701' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Oct 10 10:30:54 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2382901546' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct 10 10:30:54 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Oct 10 10:30:54 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2698340361' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 10:30:54 compute-2 podman[262322]: 2025-10-10 10:30:54.81071001 +0000 UTC m=+0.080463380 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 10:30:54 compute-2 podman[262324]: 2025-10-10 10:30:54.825709769 +0000 UTC m=+0.091725319 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid)
Oct 10 10:30:54 compute-2 podman[262323]: 2025-10-10 10:30:54.868677622 +0000 UTC m=+0.137401139 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 10:30:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:54 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:54 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:55 compute-2 ceph-mon[74913]: pgmap v1352: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:30:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1625571292' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Oct 10 10:30:55 compute-2 ceph-mon[74913]: from='client.28288 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2382901546' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct 10 10:30:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/84867561' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct 10 10:30:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2698340361' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 10:30:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3675716813' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct 10 10:30:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/109351117' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct 10 10:30:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1162105813' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct 10 10:30:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2686277470' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 10:30:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1820525157' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct 10 10:30:55 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1466512300' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct 10 10:30:55 compute-2 sshd-session[261896]: Failed password for root from 80.94.93.119 port 38682 ssh2
Oct 10 10:30:55 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:55 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:55 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:55.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 10:30:55 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Oct 10 10:30:55 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3888181591' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct 10 10:30:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:55 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:55 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Oct 10 10:30:56 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2207188954' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Oct 10 10:30:56 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:56 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:56 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:56.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:56 compute-2 ceph-mon[74913]: from='client.28312 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:56 compute-2 ceph-mon[74913]: from='client.28318 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:56 compute-2 ceph-mon[74913]: from='client.28336 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3951488466' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct 10 10:30:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/2884419151' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct 10 10:30:56 compute-2 ceph-mon[74913]: from='client.18726 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3980240607' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct 10 10:30:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/3888181591' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct 10 10:30:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1094418802' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct 10 10:30:56 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/2207188954' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Oct 10 10:30:56 compute-2 nova_compute[235775]: 2025-10-10 10:30:56.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Oct 10 10:30:56 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/988730199' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:30:56 compute-2 unix_chkpwd[262640]: password check failed for user (root)
Oct 10 10:30:56 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Oct 10 10:30:56 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1923095157' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct 10 10:30:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:56 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:56 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:57 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.28357 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: pgmap v1353: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.27446 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.18750 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.28372 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.18759 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.27458 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.27464 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.28384 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.18780 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/988730199' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/478578613' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1923095157' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:57 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:57 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:57 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:57 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:57.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:57 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:57 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:58 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:58 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 10:30:58 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:58.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.27470 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.18798 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.18807 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.27488 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.28420 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.18825 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3022188802' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/3453675055' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.27506 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/477626751' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/2053950729' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1478406306' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1622925211' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/3424359302' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 10:30:58 compute-2 nova_compute[235775]: 2025-10-10 10:30:58.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 10:30:58 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:58 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:58 compute-2 sshd-session[261896]: Failed password for root from 80.94.93.119 port 38682 ssh2
Oct 10 10:30:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:58 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:58 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 10:30:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 10:30:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 10:30:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 10:30:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Oct 10 10:30:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4055382231' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='client.18858 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: pgmap v1354: 353 pgs: 353 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 1 op/s
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='client.27527 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='client.18876 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='client.28471 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='client.27539 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='client.18897 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/102737578' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1807734272' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/1691596010' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/4055382231' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 10:30:59 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:30:59 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:30:59 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:59.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:30:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Oct 10 10:30:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1183362410' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Oct 10 10:30:59 compute-2 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1997365182' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Oct 10 10:30:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:30:59 compute-2 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:59 2025: (VI_0) received an invalid passwd!
Oct 10 10:31:00 compute-2 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 10:31:00 compute-2 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 10:31:00 compute-2 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:31:00.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 10:31:00 compute-2 ceph-mon[74913]: from='client.27557 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Oct 10 10:31:00 compute-2 ceph-mon[74913]: from='client.18954 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Oct 10 10:31:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1183362410' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Oct 10 10:31:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.101:0/1510796144' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Oct 10 10:31:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/203338503' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Oct 10 10:31:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.102:0/1997365182' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Oct 10 10:31:00 compute-2 ceph-mon[74913]: from='client.? 192.168.122.100:0/525258780' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
